CultureWaves Trends

Problem:

A world of aggregation and social media has conditioned us to accept crowd-sourced opinion as truth, and a blog as fact. We’ve learned to search, but not to discriminate. We’ve learned to ingest information, but our ability to research and push something viable back out has gotten complicated, as we now have to wade through masses of data that simply aren’t relevant. No wonder we take the easy way out and accept the voice of the masses.

For Millennials, accepting what they are fed has become the norm. After all, the Internet and the advent of search opened a whole new world, and perhaps it was enough to take it all in for a decade before beginning to sort it back out. However, we are moving into the age of Generation C, the connected generation (sometimes called Gen Z, the cohort that follows the Millennials, who are also known as Gen Y).

Twitter, Instagram, and Facebook fundamentally changed the way Millennials ingest information. Instead of actively researching a political candidate or understanding the implications of the latest trend, they rely on their feeds and pushed content. Information is blasted into their faces 24/7, from the moment they pick up their phones or log onto the Internet.

In a sharing economy, people believe what they read from their hand-picked trusted sources, and allow their opinions to be formed on the basis of tweets, Facebook posts, and listicles rather than their own independent research.

In a Gen C world, however, a change is happening. To them, digital life is inherent. They haven’t had to adapt to this new world—they were swiping devices from the crib. What it has now led to, as this generation enters adulthood, is a re-evaluation of how to put all the pieces of information together. What do you do with a search that nets you six million points of data? How do you sort out your own perspectives, make your own applications, set your own path?

Part of what is driving the change is a realization that jobs are changing in a digital economy. Kiosks replace customer service; automation replaces repetition. It’s enough to make you believe that machines will replace people. Evidence, in fact, shows where it is already happening. All we have to do is look at the automation currently being incorporated into fast food. One of the latest executives to explore an employee-free restaurant, as reported by Tech Insider, is the CEO of Carl’s Jr. and Hardee’s. He is quoted as saying of machines, “They’re always polite, they always upsell, they never take a vacation, they never show up late, there’s never a slip-and-fall, or an age, sex, or race discrimination case.”

While machines don’t, at this point, handle anything other than straightforward, programmable functions, who’s to say that they won’t soon do more of our thinking for us? After all, it’s what we’ve been demanding of intuitive devices, isn’t it?

The key question, then, is: Has technology caused us to stop doing our own thinking?

Oh sure, we think about what to eat, what to wear, even how to search for the right answer. But true critical thinking, in which we objectively look at all sides of an issue and form our own answers, our own judgment, our own approach—well, that’s a lot of work when we can just ask the collective crowd.

We crowdsource everything from whom to vote for to how to find a doctor. While technology isn’t the solution, it has become the source, the enabling font of all knowledge.

In the process, technology has begun teaching us that jobs are for machines. Machines, after all, are consistent. They are more reliable, more precise, and can do detailed work without needing a break. Sure, we create the machines, we maintain them, but at some point we have to ask ourselves: who’s answering to whom?

Tech right now is teaching us the most important thing we can learn: that jobs are for machines, and if we don’t know how to think, the jobs won’t go to people.

They’ll go to machines.

If we don’t know how to think, then what happens to our economy over time? In other words, if we aren’t consistently creating the machines, providing the answers, coming up with new ideas, are we replaceable?

The idea isn’t to condemn the machines; that shift is already in motion. We are progressing, not moving backward. But we need to recognize what this transition is doing to us as an economy.

We are no longer the ones doing the work. The machines are. It’s now our job to apply critical thinking to innovation, to our jobs, to our families, to everything that impacts our future.

Our mindset needs to shift as much as our fingers have. We need to move from a group of doers into a group of thinkers and embrace the change.

Enter the new world of Critical Thinking.

Solution:

Generation C (also called Gen Z), the connected generation, knows what’s coming. They’ve grown up with inherent technology—they haven’t really had to learn how to use it. What they are missing, however, is how to use it to discover and apply something new that is relevant to their lives.

So, while Millennials grew up thinking that technology would do their thinking for them, Gen C just may be ready to apply the power of technology to get to a new idea.

The solution is finding a way to get to relevancy by easily discovering how one thing relates to another. Technology can lay out the path, but critical thinking skills are needed to tie it all together.

The simple concept of search is what fueled a two-decade boom of the Internet, enabling consumers to look beyond what they already know and potentially get somewhere interesting. The trouble has been, though, that they had to know what they were looking for.

Search has existed in a bubble that requires consumers to know what they want to find; only once they begin the hunt can they see relevant information along the way.

However, as the age of search reeled onward, it became convoluted. Result sets went from hundreds and thousands to millions, relevant or not to what you were truly looking for.

As search became more convoluted, another path of the Internet began to open: the rise of social media. Initially, the concept of social sharing was viewed as a fun gimmick, one that had once existed only as status updates shared across messenger platforms in the ’90s. Now, with the world as an audience, consumers had the ability not only to leverage the Internet for search, but also to voice their opinions.

As social media grew, so did the audience base surrounding it—and how they were leveraging the new tools at their disposal. Social media created four distinct consumer mindsets: the thinker, the maker, the watcher, and the repeater.

• Thinkers exist as idea hubs, leveraging social media as a platform to get their ideas into the world before someone beats them to it.
• Makers leverage social media as an outlet to post their creativity and their work.
• Watchers engage heavily in social media, but do not necessarily contribute on a significant scale beyond posting comments.
• Repeaters simply reiterate the thinking, making, and watching that they find personally interesting.

As this cycle grew, social media became a vast wealth of information, one that once again called for the power of search. As social platforms grew and became niche-focused hubs, consumers began to leverage different outlets in different ways. Twitter became a thought outlet, Pinterest became a maker’s hub, Tumblr a watcher’s collective—and all of them a magnet for repeater culture.

By the time we were five years into the social media golden age, YouTube had become a content platform, Pinterest was being leveraged as an idea center for major brands and corporations, and Twitter had released Vine as a new engagement platform—all of which was being spread and shared across Facebook. We had hit the content boom, where brands couldn’t just post a tweet and get engagement; they had to put effort into their social platforms and create unique content that consumers could leverage.

Now that both brands and consumers were heavily invested in content creation, there was a new problem. Brands were copying consumer ideas, consumers were copying the ideas of other consumers, and content across the web began to feel homogenized. With everyone re-tweeting, re-posting, re-gramming, and re-pinning content, it began to feel less relevant to the consumer, much like the once-booming, now overwhelming search engine.

Content was king—but did the audience care?

As social media became the new central hub of the Internet, content was everywhere, but it wasn’t necessarily engaging. It was missing the level of true relevancy needed to genuinely engage the audience.

Social media had the ability to show us what was both trendy and timely, but it didn’t give any insight into why. Content was created based on trends and timely products, but that didn’t mean it was what the audience wanted; in some cases, consumers increasingly didn’t know what they wanted.

With everything available at their fingertips, the socially driven Internet culture could find its interests in a split second, leveraging both search and social media. But that did nothing to help people find what they could potentially be interested in. Internet culture had taken the thinking out of finding something interesting: you simply had to look up what others were doing and replicate or adapt it to what you wanted. This dramatically changed the balance behind the thinker, maker, watcher, and repeater cycle, creating more repeaters and watchers and fewer original thinkers and makers.

This has led us to ask, “What’s next?” for the Internet. Search had become oversaturated to the point of being overwhelming; consumers were looking less for new ideas and more for new ways to leverage old ideas.

This is when the discovery mindset began to emerge.

Consumers began to leverage music and entertainment services that could track their interests and then suggest new ones to them. This process took what was relevant to the consumer and suggested new, potentially relevant content. It allowed the consumer to begin moving beyond search, where they needed a starting point, and into discovery, where starting points were suggested to them.
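
To make the search-versus-discovery distinction concrete, here is a minimal sketch of the kind of logic such a service might use; the catalog, tags, and similarity scoring below are illustrative assumptions, not any particular service’s algorithm. Search starts from a query the user supplies; discovery starts from the interests the service has tracked.

```python
from collections import Counter
from math import sqrt

# Hypothetical catalog mapping items to descriptive tags (illustrative data only).
CATALOG = {
    "Synthwave Mix":     ["electronic", "retro", "instrumental"],
    "Acoustic Sessions": ["folk", "acoustic", "vocal"],
    "Lo-Fi Beats":       ["electronic", "instrumental", "chill"],
    "Indie Rock Hour":   ["rock", "vocal", "indie"],
}

def cosine(a, b):
    """Cosine similarity between two sparse tag-count vectors."""
    dot = sum(a[k] * b[k] for k in a if k in b)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def discover(history, top_n=2):
    """Suggest unheard items most similar to the user's tracked interests.

    Unlike search, the user supplies no query: their history is the input.
    """
    # Build an interest profile from the tags of everything already consumed.
    profile = Counter(tag for item in history for tag in CATALOG[item])
    # Rank only items the user hasn't seen, by similarity to that profile.
    candidates = [item for item in CATALOG if item not in history]
    candidates.sort(key=lambda item: cosine(profile, Counter(CATALOG[item])), reverse=True)
    return candidates[:top_n]

# The user never searched for "Lo-Fi Beats"; their history surfaces it anyway.
print(discover(["Synthwave Mix"]))  # ['Lo-Fi Beats', 'Acoustic Sessions']
```

In this sketch the consumer’s history does the work a query once did: the service ranks what they have not yet seen by similarity to what they already like, which is the “suggested starting point” described above.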

The social hierarchy of the Internet was beginning to be disrupted, as thinkers, makers, watchers, and repeaters now had the ability to discover new interests and content based on what they already liked. It leveled the playing field across the four types of users; no one type depended on another to keep the cycle going.

Discovery is allowing consumers to find the “what” of the Internet. Consumers have been given a jumping-off point where they can start to explore new ideas, and now they have to begin processing—critically thinking through—whether or not those ideas are worthwhile.

Critical thinking has been absent from Internet culture since the advent of search and social media, where the general consumer has been led to their interests by a person, product, or brand. Now that consumers are gaining the ability to discover things for themselves, they have to learn how to think about and analyze them. After years of social listicles telling them the top 20 of one thing and the best way to do something else, applied critical thinking is a new frontier for digesting online media.

The biggest challenge as this process moves forward is for the individual consumer to construct their own ideas and find value in them—instead of relying on likes and an audience to tell them they’re good.

Internally, we’ve spent ten years gathering information based on consumer behavior and relevancy, compiling it into a system that, when leveraged, can help build ideas and original thinking. We’ve used the core concepts of critical thinking as we have trained our own team members. This process, which takes approximately three months, enables a person to look at “what” is going on and decipher “why” it’s happening.

As our culture moves beyond looking at the “what,” the “why” continues to gain momentum and a new level of relevancy. Millennials grew up alongside the rise of Internet culture and the value of “what” culture. They are going into the Discovery web without a clear process for how to think critically. And as we continue to place importance on “why,” we can begin to see the value it will have for Generation C.

Instead of making evaluations based on likes, reviews, and responses, Generation C has the opportunity to be taught to think about why something matters versus what it is. This could fundamentally shape the groundwork of this generation as they grow up, giving them the opportunity to be critical thinkers instead of critical listeners.

Critical thinking is what will move our economy forward. Do we have a chance? Yes, because the members of Gen C at least seem to realize they have been trained by social media to believe that popularity, likes, views, and thumbs up are the same thing as validity and relevancy. We have to turn that kind of thinking on its head, and pull people back into real critical thinking.

Our concept of a social Internet will continue to change and evolve, and along with it so will the platforms of engagement.

We have a rare opportunity to get ahead of the next wave of online engagement, and make it not only smarter and more relevant, but also more impactful.

Social media taught us how to find “the dots,” and discovery can teach us how to connect them.
