Short Form Content, The Algorithm, and Us

A Change To Us

Smartphones have already effectively made us cyborgs. It has gone further than us being expected to have our phones on us to use as tools; we now expect humans to be capable of doing what smartphones can do at all times. For the first time in history, I can request someone send me a picture of something and receive a response within seconds no matter where they are in the world. There is no consideration of how they send the picture; everyone I know is now able to send pictures, and I can expect this from them and from everyone new I meet. Phones also act as a limb. Not having my phone means I am missing something that is a part of me, and there is separation anxiety associated with not having a phone on my person. That ability of instant global communication is lost. Losing your vocal cords deprives you of the ability to speak at a personal range in the same way that losing your phone deprives you of the expected ability to communicate instantly at a global range. You are left unable to complete an expected human function.

We have integrated the phone not just as a tool for work or entertainment, but absolutely. Phones have triggered a new relationship with information. Humans can now recall information with nearly no delay, changing our dynamic with thinking itself. We no longer need to worry about memorizing facts, including facts we never knew in the first place, because they can be retrieved on demand. This lets us think exclusively in more dynamic processes and connections. This effectively means that the smartphone acts not only externally as a limb, but also internally as a part of our brains.

The changes don’t stop at the personal. Previous generations needed to see people to have a relationship with them. There were some pen pals, but this was usually only after meeting or reserved for unusual cases, like fame or career. Now, it is normal for people to have multiple friends whom they have never seen but only know online. These people are met in online spaces, talk in online spaces, and play in online spaces. People no longer need to physically go somewhere to meet or see others. While this has increased the number of potential friends we can have and facilitated faster connections to friends, it isn’t all good. This could be an entire project in itself, but to mention a couple of things, this format stunts trust and depth of connection. When not confronted face to face, it is easy to act much more hostile without consideration of the other person’s response. A great example of everything that could be talked about here is breaking up over text. You are clearly close to the person; you are communicating with them, but it is still so impersonal that it leaps well past the boundary of bad manners into the realm of awful. If online relationships were the same as in-person relationships, it wouldn’t be that out of line.

Additionally, in time spent alone, people spend their free time engaged with online content. After spending a day on computers at work, to relax and be social, we rely on computers in a different context. Social media often replaces group experiences with group chats, messaging, or calls. If we don’t want to be directly social, we scroll through social media that provide us with content that we engage with socially. We see jokes or messages from others that find their way to our pages and host discussions based on those posts. Like-minded people find communities, and those communities have shared experiences while changing dynamically as their users do.

Parasocial relationships also permeate the social experience. YouTubers, streamers, and podcasters all create a type of social relationship that is predicated on one side making money from an almost-friendship with the other side. This dynamic turns friendship and enjoyment into content consumption, which is transactional. One side needs the other to make money; the other side knows this and engages because it feels personal, despite the money-making side not knowing them individually. One side is an individual, and the other is a group of unknowns. There are arguments that this is weird, mostly from older people seeing young people paying to engage with others online. There are arguments that this is normal, that it is the new equivalent of paying to watch TV. Either way, the constant portable accessibility of normalized parasocial relationships is a new social dynamic available only in recent years.

All of this goes towards giving us a new social experience. The way we interact with ourselves and others has changed. The number of avenues for different types of social experiences has increased, and the number of specific experiences possible in each of those avenues has increased dramatically alongside it. This change has happened rapidly and had many impacts beyond what people originally expected as they introduced the new technology.

Changing The Environment That Changes Us

As social beings, this change to the human experience of self and others has sweeping impacts. As we become more dependent on social media, our social experience gets heavily impacted by the social media algorithm and what it expects us to like. This is effectively the social media experience, i.e., the new social experience. Interactions demand an environment of interaction. Where before it could be the weather, where you meet, or any other spatial aspect, now the environment is that of the platform that mediates the social experience. This platform is the same for anyone anywhere. Communication moves from the spatial to the global, and from the personal to the impersonal.

This is reliant on the digital matrix. We develop further capabilities for digital tools, increasing how effective they are as tools, and in turn, increasing our reliance on them. This is not new; it happens with just about any common tool throughout history. What is new is where that development happens. Older tools, say a hammer, became popular; improvements were made to form and material, and they became more popular as they became better. With technology, development changes. Phones no longer rely on their form or material to dictate our new interaction with them; this is instead augmented through software. More tools become accessible through the same tool, and the importance of the physical tool is diminished in the eyes of the user. In the information age, people are often valued more for their intellectual capabilities than their physical ones, which allows this new model of evolution to thrive. Our physical bodies seem limited, but physical augmentation has proven to have massive potential. Our minds seem much more elastic, but thus far, mental augmentation has not proven very effective.

So if we mould computers into the all-encompassing tool for modern life, and those tools in return mould us, and so on and so forth, where does that leave our current evolution? The most recent major evolution seems to be short-form content. With the explosion of TikTok onto the scene, short-form content became the new way to engage with content online. All social media platforms have now adopted their own hosting of short-form content. This style of content consumption is also interesting for how much it surrenders to being a raw and unfiltered interaction with the intentions of the algorithm. This gives the algorithm massive power to shape the human experience, as it effectively takes choice out of consumption.

Short Form Content

Short-form content is the purest interaction with the algorithm we have, and it has gained success, permeating all social media spaces.

Before short-form content, the algorithm would feed users content that they could choose to engage with or not. Feeds would provide snapshots of content: a thumbnail and title of a video, the first few lines of a text post, or the title of a website. Users would choose what they wanted to engage with based on the information given and select to view its full content. This model persisted for a very long time. It is very reminiscent of newspapers, where people could read the headlines and then decide whether to engage further with the content. While its evolution has led to some flaws in the online space, with clickbait becoming almost necessary to draw any attention, it was still a structure that reinforced some semblance of choice for the end user.

Now, in short-form content, the algorithm provides content without regard for the choice of the end user. If the content you receive is not the content you desire, your only real choice is to scroll to accept the next one. Instead of selecting different content, you hope that the algorithm can select better content for you using statistics based on your usage. The algorithm is then able to classify you based on your interests and create a model of you and your desires to recommend content you will like in the future. If it is successful, this model of content delivery is extremely addictive. To the benefit of those who own enough data on users, the algorithms are incredibly effective.

To be successful, the algorithm needs to know what people want before they want it. To achieve this, it uses user statistics derived from big data. Data are collected to build profiles of all users based on their interaction and retention rates across all posts. From there, the algorithm is fine-tuned to deliver what users want based on their consumption, either directly or through shared common interests. As the velocity of data increases, so do results. The more data acquired on people with very specific interests, the more the algorithm learns how to appeal to people with those interests. With millions of users consuming nearly constantly, the algorithm can become extremely effective and extremely addictive. Users of TikTok cite this as the appeal of the app: within an hour of starting, it is already recommending them content that they love. So many people use it that it becomes incredibly effective at locking you into what you want to see near instantly. From there, addiction makes it difficult to leave without feeling like you’re missing out.
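The retention-based profiling described above can be sketched as a toy model. Nothing here reflects any real platform's code; the field names, the per-tag running averages, and the scoring rule are all illustrative assumptions:

```python
from collections import defaultdict

def update_profile(profile, video_tags, watch_fraction):
    # Fold one viewing event into a per-tag running average of retention.
    # watch_fraction: share of the video actually watched (0.0 to 1.0).
    for tag in video_tags:
        count, avg = profile[tag]
        profile[tag] = (count + 1, avg + (watch_fraction - avg) / (count + 1))

def score(profile, video_tags):
    # Predicted retention: mean of the user's average retention per known tag.
    known = [profile[t][1] for t in video_tags if t in profile]
    return sum(known) / len(known) if known else 0.0

def next_video(profile, candidates):
    # Serve whichever candidate the profile predicts will be watched longest.
    return max(candidates, key=lambda v: score(profile, v["tags"]))

profile = defaultdict(lambda: (0, 0.0))
update_profile(profile, ["cats"], 0.95)   # watched almost all of a cat video
update_profile(profile, ["news"], 0.20)   # swiped away from a news video

feed = [{"id": "a", "tags": ["news"]}, {"id": "b", "tags": ["cats"]}]
print(next_video(profile, feed)["id"])  # serves the cat video, "b"
```

Even this toy version shows the loop the essay describes: every swipe feeds the profile, and the profile decides the next swipe.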

Big data can work so effectively in this space because it is results-based. It does not matter what exactly people want at any given moment, as long as overall they continue to engage and consume. It does not matter that there is a human on the other side of the screen; what matters is that there is a consumer who can be appealed to. From this end result, profit can be derived. This can happen through a few avenues: the most important is advertising, having a user consume a product based on content seen, and the other is data selling. Other companies also want more end-user engagement, so data can be bundled and sold to them. The purchasing companies can then use the data to target the people most likely to consume their product. In terms of corporate interests, both parties benefit from the mass harvesting of this data, and the economy is doing what it needs to: drawing more consumption. Purchasing the data is a transaction that adds to the economy, and the data itself can draw more consumption that adds to the economy. The ultimate economic win-win. This is great for everyone.

This Is Bad For The People

Mental health is a known issue with social media.

The goal of a content creator is to get someone’s attention and then hopefully retain it. We already know from trends like clickbait that this model creates a weird way to generate initial interest. In short-form content, this seems to be through being loud and in your face, often alongside very quick cuts to keep the stimulus high. After this, the challenge becomes holding people for as long as possible. This is often so intense that it feels like an assault on the senses. Bright images, loud sounds, high emotion, and a constant feed of more in case one isn’t good enough so the next can try its shot at grabbing attention. The first few milliseconds are everything, and the first few milliseconds don’t give a person enough time to realize what they are watching. It is all about gaming impulses.

This is to say that the content is not made to be good for people. It is made to game the algorithm, which runs off data that is a results-based indicator of retention, not quality time. There is creation and consumption, but it exists in a post-human state. Making something that people like is not the goal of content creation; making content that gets rewarded by the algorithm and fed to people is. Instead of creating something informative and fun, the aim is to get exactly the amount of watch time needed for the algorithm to start putting the video in other people’s feeds. You could hear this in music as well once short-form content started blowing up. There are compelling songs, and there are songs designed to have a specifically catchy or relatable 10-second sound bite to be extracted and played over media. The song is not made to be listened to; it is made to be fed back into the algorithm to generate more streams. This also shows a different trend: people copying what the algorithm gave them, only to put it back into the social media space the initial content came from. This generally comes with the goal of wanting to blow up on social media, read as someone wanting the algorithm to recommend them to others for doing something that was recommended to them.

The human experience is no longer the focus of the engagement; the core of activity surrounds an appeal to the algorithm, with people only a secondary thought. We are effectively lab rats in a digital space, given minimal input that tests for our specific reactions, adjusts slightly, then does so again. Short-form content is akin to a Skinner box built by social media companies, where we swipe a screen and get rewarded with video after video. Even well after the reward is clearly gone and we no longer feel entertained by the content, we continue to swipe the screen over and over in the hope of the feeling coming back.

There is a major mental health cost to this rapid-fire inhuman production and consumption. I feel safe in saying that everyone is aware that short-form content is horrible for their mental health. Most people I know frequently talk about trying to quit again because of how badly their mental health deteriorates from its consistent use, but its addictive nature always brings them back. Mark Fisher has an analysis of depression where he states that it does not stem from a lack of stimulation, but rather from too much stimulus. Another analysis of depression states that the opposite of depression isn’t happiness but connection. If these hold water, short-form content is an incredibly efficient depression-producing machine, optimized almost perfectly to crush mental health. Short-form content is not only multiple stimulating videos back to back with minimal input between them; each video is often itself packed with extreme stimulation. This is incredibly stimulating, killing our reception for the less stimulating real. Additionally, since social media consumption has replaced much of our social lives, more addictive social media serves to isolate us further. We shift away from real communication to an appeal to the algorithm. This is not simply a transition away, but a hostile takeover. The short-form content algorithm is designed to maximize consumption. This means more time spent on screen, which means less time spent off-screen. The goal of the algorithm is effectively to replace real-world relationships with digital parasocial relationships and communities. Community is transferred from real people you do know to digital people you often don’t know.

The question now becomes why the algorithm would be so set on destroying mental health. The answer comes back to the motive behind all social media companies: profit. These algorithms are not designed to push content that is good for people. They could be, but this would garner less engagement, which draws fewer user statistics, which is worth less grouped as data, which lowers profits. As long as the algorithm is designed to generate more profit, it will use any trick it can to ensure people stay addicted. Now, with every platform having its own short-form content, it is nearly impossible to avoid. Whether you accidentally click on it once or see something cool that draws you to it, it is designed to make it as easy as possible to drag you in and keep you watching. There is no regard for the human on the business end, just the consumer.

The Sludge

We moulded social media by adding short-form content. Short-form content, after its early chaos, has now moulded a way to capture people even when more intense stimulation is no longer enough.

Sludge-form content is best loosely defined but widely recognized. On one part of the screen is an audible video about something just barely interesting: a comfortable show like Family Guy, a story from Reddit, or a podcast like Joe Rogan. A second video plays unrelated content that comes with some satisfaction but no real engaging theme: Subway Surfers, foam squishes, soap cutting, Trackmania, Minecraft parkour, or a physics-based car simulation on a course. Lastly, there is often some simple and recognizable audio, like the sigma grindset song. This all comes together to make content that isn’t worth watching. Nothing truly demands your attention. There is nothing to learn. Nothing unexpected happens, and there are no high-octane moments. It is just easy to watch, maybe even multiple times, before swiping again to roll the dice on what comes next.

Sludge-form content is not completely new; the concept is easily modelled after our lives. The rising rate of dopamine hits necessary to maintain a state of entertainment in young people’s brains has already caused us to sludge ourselves. If sludge-form content at its core is having multiple forms of media presented to you at once, then its existence from one source is a mere convenience rather than anything truly new. Whenever we do something that is not thoroughly engaging, we listen to music or podcasts. With phone games, videos, and social media scrolling, we entertain ourselves in times of static waiting. People already throw on TV or YouTube and then pull out their phones to watch other content. One form of content may as well provide that entire experience; it is only a natural progression. This would also allow sludge content to provide an experience of familiarity. If, when we are relaxed, we allow ourselves multiple forms of content at the same time, then having multiple forms of content at the same time might trick our brains into providing a feeling of relaxation.

It could also be that where previously we needed some form of connection to the digital matrix to feel safe or comfortable, we have introduced our brains to a near-constant form of multitasking, which is now required to feel that same comfort. Before, we had our real existence and our digital existence, both of which had to exist at all times. Now, we need multiple points of contact with the digital in order to satisfy our needs. The constant state of multitasking between the real and the digital has expanded into craving multitasking while in the digital. Instead of doing something real while listening to something digital, sludge-form content is consumed alone; no longer relying on alternative distractions, it offers us multitasking all on one platform. The real no longer needs to be relied on; we are allowed to overindulge in the digital without any immediate negative reaction. In the same way that someone may find comfort in eating too much candy here and there, sludge-form content allows us to consume too much media at once. Where eating too much candy may cause immediate physical negative effects, sludge-form content comes with longer-term negative mental effects. The feeling of being able to freely indulge ourselves without judgement comes with a purely internal negative externality.

Short Form Rest Stops

Sludge-form content is the algorithm’s way of giving temporary relaxation from the assault of stimulus that is itself, while also providing even more stimulation. Less intensity, more volume. You can take your break from short-form content in the form of a new short-form content: one that doesn’t demand you engage in consuming it, as long as you are still consuming it. It seems to have taken the space by storm, but not without recognition. Many people question why it’s there, yet still notice how they’ve watched the video multiple times and can’t explain why. It manages to fascinate without standing out too much, creating a new form of consumption of the same product. The consumers must abide by the algorithm and what it feeds them, but now there is a new style of presentation.

There is something widely discussed in the comment sections, or at least there was when I first saw sludge-form content. Most people were able to recognize it as post-human content: something that does nothing for anyone, but somehow still manages to drive extended engagement and often multiple rewatches. No one finds it interesting, yet it seems to captivate the brain just enough to keep consuming it. This engagement means the algorithm has picked it up and distributed it to people as an extremely popular form of content, yet no one likes it. It is designed to be rewarded by the algorithm. It copies in hopes of being rewarded by the algorithm. It is not burdened by a commitment to actually providing what the end user wants from a platform. With it being so easy to mass-produce, almost certainly being made with bots, we arrive at another layer of the post-human interaction of short-form content. Bots produce for the algorithm, and the algorithm distributes. Bots see what had higher engagement, prompting them to produce more of it. Still, humans engage in this environment, and it continues to mould us.

With its lack of aggressive stimulus or calls to action, it seems that sludge-form content can provide rest in the blitz of stimulus. Every individual piece is low energy in itself, and the pieces generally come together without breaking that low-energy state. They are also low commitment: if one isn’t the right kind of stimulus for a second, the consumer can switch focus to another without any action on their part. You won’t have missed anything important by changing focus. Still, as a whole, it may be just satisfying enough to get the user to watch the video again, if only to double-check the other parts and ease their mind. Especially in the case of satisfying or ASMR videos, these can be watched multiple times with the same result. They do their job, being visually or audibly pleasing without breaking the hypnotic state. Since there is never a change from the expected, multiple rewatches change nothing. The user is able to just relax.

So short form is a stressor, but now short form is also a relief. To consume short-form content is now to live in a submitted state, relying on an algorithm to supply just the right amount of stimulus as well as just the right amount of relaxation to drive the user to consume more content. As both the stress and the relief come from the same entity, you end up in a relationship with content that is oddly like an abusive one: that which harms you is the one that comforts you, only to harm again, leaving you addicted to the source of the harm. With it being a relationship with the algorithm instead of a human, it leaves us in a breakdown of reality where we are social beings in an antisocial relationship, looking for the comfort of a relationship of some kind, only finding depressing hyper-stimulus that we cannot escape due to socially normalized addiction. This addiction is necessary for businesses, as it generates data and profits.

Seeing as the profits will be chased, this is also oddly reminiscent of the market cycle. The market has a bipolar boom and bust cycle, giving euphoric highs followed by crippling depressions. These depressions bring reforms and the promise of better to keep us invested in the next euphoric boom. Short-form content issues us booms of stimulus with some content we can share with friends, followed by a depressive real-life state where we are under-stimulated. Addiction brings us back to short-form content, where it refers us to sludge content to prime us to go back into the next doom scroll. This keeps people addicted, which is great for deriving more interaction. These moments of euphoria and great share-worthy content, just like market booms, keep us addicted, as we want that feeling again.

Panopticons And Power Dynamics

Companies have control over algorithms in the grand scheme of things, but when it comes down to the end user, users have surrendered all control to the algorithm. They can tune the algorithm to their liking, but they can’t evade it and still consume short-form content. In the end, no matter what tinkering they do, there will be a core of the algorithm that prioritizes profit above all else, and it will find profits wherever it can. The algorithm knows us well, but we don’t know it. We just receive the content at the end and don’t think too much about it. Naturally, this makes me think of the panopticon.

First, with the acknowledgement that big data can detect about us even what we don’t recognize about ourselves, big data goes in the center tower. This is the true source of control: what is all-knowing and what monitors us constantly. Outside would be, not necessarily us, but our souls. As we scroll through short-form content, very simple information can be gathered: our retention, speed of swipe, how often we go back, and information on the video, including who made it, what its contents are, and other statistics like engagement. The algorithm can provide us with insights inspired by big data. Based on how we respond, big data doesn’t just learn what we like or want to see; it gleans more information than we can immediately know about ourselves. We may not have wanted to watch a certain video two or three times, we may not be actively interested, or we may even actively dislike it, but if we are engaged, that is information the algorithm will adapt to under the guiding hand of big data. In this scenario, the party with control over all of the data gets all of the information and manages the control, and the users simply engage and have their data read.
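The per-swipe signals listed above (retention, swipe speed, rewatches) can be imagined as a small event record that gets collapsed into a single engagement number. This is a toy sketch; the field names, the rewatch bonus, and its weight are all hypothetical:

```python
from dataclasses import dataclass

@dataclass
class ViewEvent:
    # One swipe's worth of the signals named in the text; all fields are made up.
    video_id: str
    watch_fraction: float  # retention: share of the video watched
    swipe_delay_ms: int    # how long before the user swiped away
    rewatches: int         # how many times they scrolled back to it

def engagement(e: ViewEvent) -> float:
    # Toy composite signal: retention plus a small bonus per rewatch, capped at 1.
    return min(1.0, e.watch_fraction + 0.2 * e.rewatches)

events = [
    ViewEvent("a", 0.3, 400, 0),   # skimmed and skipped quickly
    ViewEvent("b", 0.9, 4000, 2),  # watched through, then rewatched twice
]
ranked = sorted(events, key=engagement, reverse=True)
print([e.video_id for e in ranked])  # most engaging first: ['b', 'a']
```

Note that the rewatches field rewards exactly the behaviour described above: watching a video two or three times counts as engagement whether or not the viewer actually liked it.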

Users on the outside have some form of image of each other. They see the content that others produce as it is filtered through by the algorithm, but they don’t actually know or see each other. In order to engage with each other, they must act solely through the filter of the algorithm in the center. They have no direct social interaction with each other. Human interaction is effectively monetized not by end users paying for a service, but by their giving up control. Their information and digital existence are subject to constant monitoring, which provides data to be sold. Monetization does not care about the end users, the people. Their function becomes consumption from the algorithm for the purpose of monitoring.

This may model the data business, but that is not the only power dynamic at hand. Another panopticon could be built on users’ control over each other. In this case, the users create a collective consciousness that acts as the guard in the center. On the outside is each individual user. Because we are social beings, the collective us creates a levelling field. We moderate ourselves through the existence of others like us on the platform. In this panopticon, the algorithm acts as vision. We get sight into what the collective us is by seeing what the algorithm recommends, and we adjust our behaviour accordingly. We create or engage with similar content to become part of the community. We feed back into the collective us. We see what individuals are engaging with, adopt it, then put out more content to draw further engagement. We do not go too far against what the algorithm provides us, because doing so risks ostracisation from the community. As social beings, we avoid ostracisation, even though we are in this case already ostracised in that we may never directly see anyone in the community. Entire social communities exist completely without real social behaviour. All social experience exists on the platform, where it can be tracked and monetized.

Under this more psychopolitical model, this us would effectively be the big other: the realm of the believable that knows our desire and wills for us, which we individually appeal to despite not fully grasping what its specific desires are. We may not individually want it directly, but as the algorithm rewards us for wanting it, the algorithm can want it for us and we will follow. We want to be perceived as correct and not stand out so much in the community as to go against it, as that would be antisocial and against our instincts as social beings. For any one person, it may not be their perception, but that does not matter, as the wider perception will approach the wills of the algorithm on a grand scale.

This also benefits capital, which can rely on psychopolitics as a means of control. Communities moderate themselves without the platform having to invest in actual moderation. Since these are platforms of community content, a platform pushing or hosting something bad can absolve itself of any real blame. Any controversy can be pushed onto the end users, making it a safe investment. Any improvements made can then also be framed as generosity or change-making in the world, despite existing only as a response to a problem created by the platform model.

Surrender

In a psychopolitical model, we have already effectively surrendered to technology.

Skynet from the Terminator series is the go-to pop culture example of an AI takeover of humanity. It stands as the threat that our technological advancements, led by AI, could take over society and leave us fighting to survive under its power. In this setting, we know that AI will take over. We know that it will use repressive techniques to do so. This works under a biopolitical model of power, but under a psychopolitical model, it holds less weight. Why enslave and eradicate humans when you could trivially influence them to do what you want?

Everything we do is based on perspective, built on the core of our experiences. At this point, we have integrated our experience of being with the digital matrix, allowing it access to everything we do (and working towards everything we think, with brain implants in the works). People are also now connected to the digital matrix from birth, never having an experience away from it. One could argue it begins before birth, since parents will be using the devices beforehand, and the devices will impact decisions and activities that influence a baby pre-birth. We are exposed to the power dynamic of the algorithm deciding what we see and think from the very beginning, making that experience absolute. People don’t know what they don’t know, and the belief that people will be able to just separate from the digital matrix is shallow at best, given how people who actively try to separate from it still fail.

If we only experience life through the will of the algorithm, then the algorithm can manufacture our consent. The algorithm can decide what we do and do not see. An algorithm that has the intention of maximizing usage for profit will do just that. This creates a massive filter, a filter against content that will actively reduce engagement. If engagement is reduced in a small group based on certain content, that content will be phased out in favour of other content that increases engagement. Nothing else matters.
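The filtering dynamic described here, where content that reduces engagement is steadily phased out, can be sketched as a toy survival loop. The engagement numbers, content labels, and below-average cutoff are purely illustrative assumptions:

```python
def circulate(items, rounds=3):
    # Each round, drop content whose engagement falls below the pool average.
    # Engagement is a fixed number per item here; a real system would keep
    # remeasuring it, but the filtering dynamic is the same.
    pool = dict(items)  # name -> engagement score
    for _ in range(rounds):
        if not pool:
            break
        avg = sum(pool.values()) / len(pool)
        pool = {name: eng for name, eng in pool.items() if eng >= avg}
    return pool

# Hypothetical content pool with illustrative engagement scores.
surviving = circulate({"thoughtful": 0.2, "rage bait": 0.8, "sludge": 0.6})
print(sorted(surviving))  # only the highest-engagement content remains
```

Nothing in the loop evaluates what the content is; low-engagement content disappears regardless of its value, which is the filter the paragraph describes.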

Here, we arrive at capitalist realism as experienced through the algorithm. The algorithm can’t reward the non-algorithm. If the algorithm tries to discredit its own existence, maybe by saying that it is bad for the world, it can only do so in a way that increases engagement on the platform, by extension ensuring that more people use it even as it says it should not be used. Consistent efforts to drive people away become absorbed by the creation feedback loop: trying to reach more people to tell them to stop using the platform means trying to get more people to consume. When people consume it, other people create more similar content to reach more people telling them to do the same. It becomes a trend that draws further engagement. Fighting against it on the platform only works in its favour, as it becomes an aesthetic that people are absorbed into consuming. Not using the platform would mean isolating yourself from it and limiting your reach to only the people who have already removed themselves. If anything, posting content against it bolsters its position, as it ensures that those who are against it have a home where they feel digitally and socially welcomed by a community on that platform. Their anti-algorithm ideology is performed for them by the algorithm they are against, and all they have to do is consume it.

We create the environment, the environment creates us. We create social media, social media creates a new paradigm of social existence. We indulge in that paradigm, and optimize it to do more, and in doing so, it forms a greater part of our social habits. Spending so much time on it, we add more and more entertaining content, which we then become dependent on for entertainment. Maybe it's time we start slowing down and really thinking about the implications of what we introduce.


Things slip into the sludge; the sea may rise. Reporting live from every side of the hive and the sludge.


Oncle Spencer