In the ever-evolving landscape of digital marketing, the question “Is SEO dead?” has surfaced with increasing frequency. The short answer is no, but the rules of the game have changed significantly, largely due to the advent of artificial intelligence (AI). The rise of AI has reshaped search results, diminishing the importance of traditional SEO tactics like backlinks and keyword stuffing, and emphasizing the need for high-quality, user-centric content.
The Impact of Artificial Intelligence on Search Results
Artificial intelligence, particularly in the form of machine learning algorithms, has revolutionized how search engines evaluate and rank web content. Google’s AI systems, such as RankBrain and BERT, are designed to better understand the context and intent behind user queries. This shift means that search engines are now more adept at discerning the relevance and quality of content, rather than relying solely on the presence of keywords or the number of backlinks.
AI Summaries and User Intent
One of the most significant changes brought about by AI is the generation of AI summaries in search results. These summaries, often found in “featured snippets” or answer boxes, provide users with direct answers to their queries without requiring them to click through to a website. This development prioritizes content that is clear, concise, and directly answers the user’s question. Consequently, websites must focus on providing value and directly addressing user needs to remain competitive in search rankings.
The Declining Importance of Backlinks
Backlinks have long been a cornerstone of SEO strategy, serving as endorsements of a website’s authority and relevance. However, their influence is waning in the face of AI advancements. While backlinks still play a role in SEO, search engines are increasingly capable of evaluating content quality and relevance independently of external endorsements. This shift reduces the efficacy of tactics that focus primarily on acquiring backlinks and underscores the importance of producing substantive, high-quality content.
Content Overload: A Misguided SEO Tactic
In an attempt to boost SEO rankings and increase engagement time, many content creators adopted the tactic of adding extensive background information, tips, and personal stories to their webpages. The idea was that more content equated to greater relevance and higher rankings. This approach is particularly prevalent on recipe websites, where users often find themselves scrolling through paragraphs of unrelated content before reaching the actual recipe.
While this strategy can increase keyword density and on-page time, it often makes webpages less beneficial to end users. Overloaded pages can frustrate users, leading to higher bounce rates and ultimately harming the site’s SEO performance. Google’s recent updates aim to curb this practice by prioritizing content that directly answers user queries and provides a better user experience.
Google’s Crackdown on Low-Quality Content
In response to the proliferation of low-quality, undifferentiated niche sites designed to game the SEO system, Google has implemented measures to close loopholes that previously allowed such sites to flourish. These updates target content farms and low-effort websites that prioritize quantity over quality. Google’s algorithm now places greater emphasis on unique, well-researched, and valuable content, effectively reducing the visibility and profitability of low-quality sites.
The Rise of Chatbots and Their Impact on Search Engines
As of April 2024, Google still holds the dominant search engine position, with a market share of around 90.91% according to Statcounter (Search Engine Market Share Worldwide | Statcounter Global Stats). However, as AI continues to evolve, the rise of chatbots represents a significant shift in how users interact with search engines. Chatbots, powered by advanced natural language processing, can provide immediate, conversational responses to user queries. This development reduces the need for users to navigate through multiple webpages to find information, potentially decreasing website traffic from traditional search engines.
Chatbots offer a more streamlined and efficient way for users to obtain information, which means that websites need to adapt by ensuring their content is optimized for these AI-driven tools. Providing clear, concise, and structured information will become increasingly important as chatbots become a more prevalent means of accessing information.
The Popularity of Specialized Search Websites
The growing popularity of specialized search websites is reshaping the landscape of online search, posing significant competition to general web search engines like Google. Platforms such as Zillow.com for real estate, Cars.com for automobiles, Kayak.com for travel, Indeed.com for job listings, and Amazon.com for online shopping offer highly tailored search experiences that cater to specific user needs. These specialized search engines provide detailed, industry-specific information and advanced filtering options that general search engines struggle to match. By focusing on niche markets, these sites deliver more relevant results and a superior user experience, driving users to bypass traditional search engines in favor of platforms that offer precise, domain-specific search capabilities.
Conclusion
SEO is not dead, but it is undergoing a profound transformation driven by artificial intelligence. Traditional tactics like backlink building and keyword stuffing are losing ground to strategies that prioritize content quality and user experience. AI’s ability to understand user intent and generate concise summaries is reshaping search results, while Google’s crackdown on low-quality content underscores the need for authenticity and value.
As chatbots and AI continue to evolve, content creators must adapt by focusing on delivering high-quality, relevant content that meets user needs. In this new era of SEO, the mantra “content is king” holds truer than ever, but with a renewed emphasis on quality, relevance, and user satisfaction.
Recently, I completed a project that required handling a data source with an inconsistent structure and non-standardized data (commonly referred to as dirty data). Each record contained over 400 fields, but the order of these fields varied unpredictably from one record to the next. The data also suffered from inconsistencies within the fields themselves. For example, some records used abbreviations, while others spelled out terms in full. To complicate things further, the data was accessed through a RESTful API (Representational State Transfer).
The Challenge
Dynamically importing this data directly from the REST API into the target application proved to be problematic. The import script would identify malformed records and skip them entirely, resulting in data loss. While the script was resilient in that it could continue functioning despite encountering errors, it was not adaptive. It lacked the ability to intelligently handle the varying structure of the source data.
In simpler terms: the source data was a mess, and I needed to develop a solution that could intelligently manage it.
The Solution: A Staged ETL Approach
To resolve this issue, I applied a staged approach using the ETL process (Extract, Transform, Load), a common method for dealing with problematic data. Here’s how the ETL process works:
Extract: Data is pulled from one or more sources (such as databases, files, or APIs) and stored in a temporary location.
Transform (also known as “Data Scrubbing/Cleaning”): The extracted data is analyzed, cleansed, and standardized. This step resolves inconsistencies and errors, transforming the data into the desired structure for the target system.
Load: The cleaned and standardized data is then imported into the target system, such as a database or application, for end-user access.
For this project, I implemented a data-adaptive approach, which not only ensured resilience but also allowed the software to intelligently handle and cleanse the dirty source data.
Implementing the Data-Adaptive Approach
The concept is straightforward. First, use the API to retrieve the data records and store them in a temporary intermediary file, without attempting any corrections or cleansing at this stage. This essentially dumps the data into a location where it can be processed using a programming language and tools of choice.
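To make the Extract step concrete, here is a minimal Python sketch that pulls records from a hypothetical paginated REST endpoint and lands them, untouched, in a staging file; the URL, the pagination behavior, and the file name are placeholders rather than details of the actual source system.

    import json
    import requests

    API_URL = "https://api.example.com/records"  # hypothetical endpoint, not the real source
    STAGING_FILE = "staging_raw.json"

    def extract_to_staging():
        """Pull raw records from the REST API and dump them, uncleansed, into a staging file."""
        records = []
        page = 1
        while True:
            response = requests.get(API_URL, params={"page": page}, timeout=30)
            response.raise_for_status()
            batch = response.json()
            if not batch:  # assume an empty page means everything has been read
                break
            records.extend(batch)
            page += 1
        # No correction or cleansing here; just land the data where it can be worked on.
        with open(STAGING_FILE, "w", encoding="utf-8") as f:
            json.dump(records, f)
        return len(records)

    if __name__ == "__main__":
        print(f"Staged {extract_to_staging()} raw records")

Keeping this step deliberately dumb pays off later: the staging file can be re-processed as many times as needed while the Transform logic is refined, without hitting the API again.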
During the Transform phase, the software analyzes each row of data to determine the location of each required data field. In simple terms, this step “finds” the relevant data in each record, even when the structure is inconsistent.
Once the necessary data fields are identified and their locations known, the software can iterate through each row, applying logic to cleanse and standardize the data. Afterward, the cleaned data is written into a new, properly structured file that is consistent and ready for import into the target system.
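To illustrate the "find the field first, then cleanse it" idea, the short Python sketch below assumes each raw record arrives as a dictionary whose keys vary in naming, casing, and order; the field aliases and the abbreviation table are invented for the example.

    # Map each target field to the aliases it might appear under in the raw data.
    FIELD_MAP = {
        "state": {"state", "st", "state_code"},
        "list_price": {"list_price", "listprice", "price"},
    }

    # Expand abbreviations so every cleansed record uses one consistent form.
    STATE_ABBREVIATIONS = {"MT": "Montana", "CA": "California"}

    def find_field(raw_record, aliases):
        """Return the value of the first key in the record that matches a known alias."""
        for key, value in raw_record.items():
            if key.strip().lower() in aliases:
                return value
        return None  # the field is genuinely missing from this record

    def cleanse_state(value):
        """Standardize state names, expanding abbreviations where recognized."""
        if value is None:
            return None
        value = value.strip()
        return STATE_ABBREVIATIONS.get(value.upper(), value.title())

    raw = {"ST": "MT", "Price": "525000"}  # keys arrive in no particular order or casing
    print(cleanse_state(find_field(raw, FIELD_MAP["state"])))  # -> "Montana"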
Enhanced Transformation Logic
During the transformation process, I incorporated some additional features. Based on the presence or absence of certain data in each record, the software dynamically generated new data fields that might have been missing from the source. This approach allowed the system to compensate for incomplete records, further improving data integrity.
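As a small, hypothetical example of this kind of derived field (the field names and the rule are invented for illustration, not taken from the actual project):

    def derive_preferred_contact(record):
        """Fill in a 'preferred_contact' field the source never provides,
        based on which contact details are actually present in the record."""
        if record.get("email"):
            return "email"
        if record.get("phone"):
            return "phone"
        return "mail"

    record = {"name": "Jane Doe", "phone": "406-555-0100"}  # no email in the source record
    record["preferred_contact"] = derive_preferred_contact(record)
    print(record["preferred_contact"])  # -> "phone"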
Pseudocode for the Solution
Here’s a simplified version of the process in pseudocode:
// Step 1: Retrieve data records from the source system
sourceData = retrieveDataFromSource()
// Step 2: Create a map of required data fields and identifiers
fieldMap = createFieldMap([
    {fieldName: "Field1", identifier: "SourceField1"},
    {fieldName: "Field2", identifier: "SourceField2"},
    // Additional field mappings as needed
])
// Step 3: Initialize an array to store cleansed data
cleansedData = []
// Step 4: Loop through each row in the source data
for each row in sourceData:
    // Step 5: Analyze the row using the map to identify required data fields
    requiredFields = []
    for each field in fieldMap:
        requiredFields.append(findField(row, field.identifier))
    // Step 6: Cleanse and standardize each required data field
    cleansedRow = []
    for each field in requiredFields:
        cleansedRow.append(cleanseAndStandardize(field))
    // Step 7 (Bonus): Dynamically add new fields based on business logic
    if businessLogicConditionMet(row):
        cleansedRow.append(createAdditionalField())
    // Step 8: Store the cleansed row in the output array
    cleansedData.append(cleansedRow)
// Step 9: Save cleansed data to the target platform
saveToTargetPlatform(cleansedData)
Explanation:
Step 1: Retrieve the dataset from the source.
Step 2: Map the required fields and their attributes to locate them in the source data.
Step 3: Initialize an array to store the cleansed data.
Step 4: Loop through each row of source data.
Step 5: Identify the required data fields in the current row using the field map.
Step 6: Cleanse and standardize each identified field.
Step 7 (Bonus): Add extra fields based on business logic, dynamically creating new fields if needed.
Step 8: Store the cleansed row of data in the output array.
Step 9: Once all rows are processed, save the cleansed data to the target platform for further use.
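For readers who want something closer to runnable code, here is one way the pseudocode could translate into Python. It assumes the raw records were staged as a JSON list of dictionaries during the Extract step, and the field names, aliases, and cleansing rules are illustrative stand-ins for the real ones.

    import csv
    import json

    # Step 2: target field name -> aliases it may appear under in the source data.
    FIELD_MAP = {
        "full_name": {"full_name", "name", "contact_name"},
        "state": {"state", "st", "state_code"},
    }

    def find_field(row, aliases):
        """Step 5: locate a required field in a row whose keys vary in naming and order."""
        for key, value in row.items():
            if key.strip().lower() in aliases:
                return value
        return None

    def cleanse(value):
        """Step 6: basic standardization -- trim whitespace and normalize casing."""
        return value.strip().title() if isinstance(value, str) else value

    def transform(source_rows):
        cleansed_rows = []
        for row in source_rows:  # Step 4
            cleansed = {name: cleanse(find_field(row, aliases))  # Steps 5-6
                        for name, aliases in FIELD_MAP.items()}
            if cleansed.get("full_name") is None:  # Step 7: derive a missing field
                first = find_field(row, {"first_name", "fname"})
                last = find_field(row, {"last_name", "lname"})
                if first and last:
                    cleansed["full_name"] = f"{cleanse(first)} {cleanse(last)}"
            cleansed_rows.append(cleansed)  # Step 8
        return cleansed_rows

    def load(cleansed_rows, path="cleansed.csv"):
        """Step 9: write a consistent, properly structured file ready for import."""
        with open(path, "w", newline="", encoding="utf-8") as f:
            writer = csv.DictWriter(f, fieldnames=list(FIELD_MAP))
            writer.writeheader()
            writer.writerows(cleansed_rows)

    if __name__ == "__main__":
        with open("staging_raw.json", encoding="utf-8") as f:  # Step 1: the raw extract
            source_rows = json.load(f)
        load(transform(source_rows))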
Conclusion
By employing a data-adaptive approach, I was able to successfully manage a problematic data source with inconsistent structure and content. This solution not only made the system resilient to errors but also capable of dynamically correcting and adapting to the data it processed. The staged ETL approach, with enhancements during the transformation phase, ensured that the data was accurately cleansed and properly structured for importing into the target application.
In the rapidly evolving digital landscape, businesses are navigating a complex terrain to engage users across various touchpoints. Digital Experience Platforms (DXP) have emerged as a core strategy for organizations seeking to optimize their digital interactions, ensuring seamless and immersive experiences for users. A DXP is an integrated suite of technologies that provide a comprehensive solution for managing the entire digital experience lifecycle.
84% of customers say that being treated like a person is key to winning their business
The No. 1 reason customers switch brands is because they feel unappreciated
The technology that organizations use to establish and enhance customer experience (CX) continues to evolve. Digital experience platforms evolved from web content management systems (CMS) to help marketers meet rising customer expectations. Digital experience platforms (DXPs) now serve as the backbone of customer experience, allowing brands to offer the personalized and relevant digital experiences customers expect today.
Key Capabilities of Digital Experience Platforms
Content Management
DXPs offer robust content management systems (CMS) to create, edit, and publish content across different digital channels.
Personalization tools enable dynamic content based on user preferences, behavior, and demographics.
Multi-Channel Delivery
DXPs facilitate consistent delivery of content across diverse channels, including websites, mobile apps, social media, and more.
Responsive design ensures optimal user experiences on various devices.
User Analytics
Advanced analytics tools track user behavior, engagement patterns, and content performance.
Insights gathered help businesses make data-driven decisions to enhance user satisfaction.
Commerce Integration
Seamless integration with e-commerce platforms allows businesses to offer a unified and efficient buying experience.
Features like product recommendations and personalized shopping carts enhance user engagement.
Marketing Automation
DXPs provide robust marketing automation tools for targeted campaigns, email marketing, and lead nurturing.
Automation streamlines marketing efforts, fostering better user engagement and conversion rates.
Social Media Integration
Social media features enable content sharing, user-generated content integration, and social listening.
DXPs leverage social channels for brand promotion and community building.
Customer Relationship Management (CRM)
Integration with CRM systems enables businesses to manage customer relationships effectively.
Unified customer data allows for personalized interactions and improved customer service.
Search and Navigation
Advanced search capabilities and intuitive navigation enhance user experience.
AI-powered search algorithms provide relevant and quick results.
Popular Digital Experience Platform Providers
Adobe Experience Cloud
Adobe Experience Cloud is a suite of cloud-based marketing and analytics solutions that helps businesses create, manage, and measure their digital experiences. It includes a wide range of products, such as Adobe Experience Manager (AEM) for content management, Adobe Target, Adobe Analytics, and Adobe Campaign. Adobe Experience Cloud is designed to help businesses understand their customers, personalize their marketing campaigns, and measure the results of their efforts.
Sitecore
Sitecore Digital Experience Platform (DXP) is a cloud-native platform that helps organizations create, manage, and deliver personalized digital experiences across multiple channels. It provides a comprehensive set of tools for content management, marketing automation, commerce, and analytics.
Salesforce Experience Cloud
Salesforce Experience Cloud is a suite of cloud-based digital experience (DX) solutions that helps businesses create, manage, and deliver personalized experiences across multiple channels, including websites, mobile apps, portals, and communities. It provides a comprehensive set of tools for customer experience (CX), marketing, commerce, and analytics.
Oracle Cloud CX
Oracle Cloud CX, formerly known as Oracle Advertising and Customer Experience (CX), is a cloud-based suite of applications that help businesses create, manage, and deliver personalized customer experiences across multiple channels. It includes a range of products, such as Oracle Marketing Cloud, Oracle Sales Cloud, Oracle Service Cloud, and Oracle Commerce Cloud.
Acquia
Acquia Digital Experience Platform (DXP) is a cloud-based platform that helps businesses create, manage, and deliver personalized digital experiences across multiple channels. It is a comprehensive suite of tools that includes content management, marketing automation, commerce, and analytics. The core Acquia Drupal Cloud platform is based on the open-source Drupal CMS, but the Acquia DXP also includes a number of proprietary modules and extensions that are not open source.
Liferay
The Liferay DXP is a powerful and flexible platform that can be used to build a wide variety of digital experiences, including websites, portals, intranets, and mobile apps. It is a popular choice for enterprises that need a scalable and extensible platform that can support their digital transformation initiatives. The core Liferay Portal platform is available under the LGPL v3 license, which is an open-source license that allows for free modification and distribution of the software. Liferay DXP modules and many of the Liferay DXP extensions are proprietary.
What is a composable DXP?
A composable DXP, or Digital Experience Platform, is a modular approach to building digital experiences that utilizes best-of-breed components from various sources. It contrasts with the traditional monolithic DXP, which provides a fixed set of features and functionality.
A composable DXP offers several advantages, including:
Flexibility: Composable DXPs allow organizations to select and combine components that align with their specific needs and requirements. This flexibility enables them to create tailored digital experiences that match their unique brand and user personas.
Agility: Composable DXPs facilitate faster development and deployment of digital experiences. By leveraging pre-built components, organizations can quickly assemble and adapt their digital platforms to meet evolving market trends and customer demands.
Scalability: Composable DXPs are inherently scalable, allowing organizations to seamlessly add or remove components as their needs change. This scalability ensures that their digital platforms can grow alongside their business.
Cost-effectiveness: Composable DXPs can potentially reduce costs by eliminating the need for expensive custom development and maintenance of a monolithic platform. Organizations can instead focus on selecting and integrating components that provide the best value for their money.
Conclusion
In the era of digital transformation, Digital Experience Platforms play a pivotal role in orchestrating cohesive and engaging user experiences. By integrating various technologies into a unified solution, DXPs empower businesses to adapt to changing user expectations, foster brand loyalty, and drive engagement and growth in the digital realm. The selection of a DXP should align with the specific needs and goals of each business, ensuring a tailored and effective digital strategy.
In recent years, a concept known as the “Dead Internet Theory” has emerged, suggesting that the internet as we know it is slowly dying or becoming obsolete. This theory has sparked debates and raised questions about the future of the digital age. In this article, we will explore the Dead Internet Theory, its origins, its arguments, and the reality behind this provocative concept.
Understanding the Dead Internet Theory:
The Dead Internet Theory posits that the internet is losing its original spirit of openness, freedom, and decentralization. Proponents argue that increasing corporate control, government surveillance, censorship, and the dominance of a few powerful entities are stifling innovation and transforming the internet into a highly regulated and centralized platform. They claim that the internet is losing its vitality and becoming a mere reflection of offline power structures.
Origins and Key Arguments:
The origins of the Dead Internet Theory can be traced back to concerns raised by early internet pioneers and activists who championed a free and open internet. They feared that the commercialization and consolidation of online platforms would undermine the principles that made the internet a transformative force.
Among the concerns put forth by the Dead Internet Theory are fake traffic from bots and fake user accounts. According to numerous sources, a significant percentage of internet traffic comes from bots, with estimates ranging from roughly 42% to over 66% of all traffic. Bots are automated software programs that perform various tasks, ranging from simple to complex, on the internet. They can be beneficial, such as search engine crawlers or chatbots, or malicious, like spam bots or distributed denial-of-service (DDoS) attack bots.
The problems associated with bot traffic arise primarily from malicious bots that engage in fraudulent activities, spamming, data scraping, click fraud, credential stuffing, and more. These activities can have severe consequences, including financial losses, compromised security, reputational damage, and disruptions to legitimate online services.
Social media platforms have been grappling with the challenge of fake user accounts for some time. These accounts are created for various purposes, including spreading misinformation, engaging in spam or fraudulent activities, manipulating public opinion, or conducting malicious campaigns. Increasingly, bots use these fake accounts to “push” a narrative. This activity is especially noticeable during political elections.
Proponents of The Dead Internet Theory highlight several key arguments:
Centralization and Monopolistic Power: They argue that a small number of tech giants now dominate the internet, controlling vast amounts of data and shaping user experiences. This concentration of power limits competition and stifles smaller players’ ability to innovate.
Surveillance and Privacy Concerns: With the rise of surveillance technologies and data breaches, privacy advocates express worry that individuals’ online activities are constantly monitored and exploited for various purposes, eroding trust in the internet.
Censorship and Content Control: The theory also highlights instances of government-imposed censorship, content moderation challenges, and algorithmic biases, suggesting that freedom of expression is under threat.
Net Neutrality and Access: Advocates argue that the internet’s openness is compromised by practices that prioritize certain types of traffic or restrict access based on geographic location or socioeconomic factors, leading to a digital divide.
The Reality:
While the concerns raised by the Dead Internet Theory hold some validity, it is important to approach the subject with nuance. The internet remains a dynamic and evolving medium, shaped by technological advancements and societal changes. While challenges exist, numerous initiatives and movements aim to preserve the internet’s founding principles.
Efforts such as decentralized technologies (like blockchain), open-source software, encryption tools, and net neutrality advocacy strive to counteract centralization, surveillance, and censorship. Additionally, the proliferation of alternative platforms, social networks, and online communities ensures that diverse voices and opinions can still find a space online.
Final Thoughts:
The Dead Internet Theory serves as a reminder of the ongoing struggle to maintain an open, free, and decentralized internet. While concerns over centralization, surveillance, and censorship are valid, the internet is not irreversibly “dead.” It continues to evolve, driven by the collective actions of individuals, organizations, and policymakers. It is important that the internet remains a powerful tool for connectivity, knowledge-sharing, and empowerment.
The number of people using conversational AI tools, such as voice assistants and chatbots, is rapidly growing and redefining users’ relationship with technology. Voice technology has become the most disruptive force to hit the world since the internet became a visual medium. Voice assistants and chatbots have become an additional interface for marketing purposes, bringing an entirely new way of interacting with customers and adding value to their experience.
123.5 million US adults will use voice assistants at least once per month in 2022. That number is expected to grow to approximately 50% of all US adults in the next 3 years.
Some of the most well-known and widely used voice-activated assistants include:
Alexa, developed by Amazon
Google Assistant, developed by Google
Siri, developed by Apple
Cortana, developed by Microsoft
Bixby, developed by Samsung
ChatGPT is a variant of the GPT (Generative Pre-trained Transformer) language model developed by OpenAI. It is a machine learning model designed specifically for chatbot applications and is able to generate natural-sounding responses to a wide variety of inputs. The goal of ChatGPT is to make AI systems more natural and safe to interact with. The service amassed millions of users in less than a month.
While ChatGPT is not a traditional voice-activated assistant like Alexa or Siri, it is capable of generating text-based responses to user inputs and can be used in a chatbot application that allows users to communicate with it using natural language. It is not designed to be used with voice input, but rather is intended to generate text responses that can be displayed on a screen or presented to the user in some other way.
ChatGPT surpassed 1 million users in just 5 days. To put that in perspective, it took Netflix over 41 months, Facebook 10 months, and Instagram 2.5 months to reach 1 million users.
Google’s management has reportedly issued a ‘code red’ amid the rising popularity of the ChatGPT AI. The move comes as talks abound over whether ChatGPT could one day replace Google’s search engine.
Sridhar Ramaswamy, who oversaw Google’s ad team between 2013 and 2018, said that ChatGPT could prevent users from clicking on Google links with ads, which generated $208 billion in 2021 (81% of Alphabet’s overall revenue).
Google has a similar technology called LaMDA. LaMDA is a machine learning model developed by Google for natural language processing tasks. It is a variant of the Transformer model, a type of neural network architecture that is particularly well-suited to processing and generating large amounts of text data.
LaMDA is an acronym that stands for “Language Model for Dialogue Applications.” Unlike most language models, it was trained on dialogue, which makes it well suited to open-ended conversation and to generating responses that feel natural in the context of a discussion rather than short, factual snippets.
LaMDA comes out of Google’s AI research group and is aimed at conversational applications such as language understanding and chatbot development. It is a powerful and flexible machine learning model that has the potential to significantly advance the state of the art in natural language processing.
Google is hesitant to release its AI chatbot LaMDA to the public in its current state because of concerns about “reputational risk” stemming from its high margin of error and vulnerability to toxicity. Like most technologies, this tech can be abused.
Integrating voice assistants and chatbots into your marketing strategy isn’t easy.
Your content marketing and editorial strategies should reflect how your business plans to leverage the technology and how invested you are in it from a content point of view. For most businesses, voice and chatbot marketing should start with “search.” In 2022, more than one billion voice searches were conducted in a single month. To engage consumers, content marketers must emphasize short-form content that offers quick, crisp answers to users’ questions.
SEO techniques can be used for voice-activated search results because results from Siri, Cortana, and others frequently draw on Google’s featured snippets. When focusing on voice-search optimization, it is essential to remember that a virtual assistant can only deliver a single search result per request. Marketers should adopt SEO guidelines geared to spoken-word search behaviors and informational needs.
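As one hedged illustration of structuring content so that featured snippets and voice assistants can lift a single, direct answer, the Python sketch below generates FAQPage structured data (JSON-LD) using the schema.org vocabulary; the question, the answer, and the page it would be embedded in are hypothetical.

    import json

    # Hypothetical Q&A content, written as a direct, spoken-style answer.
    faq_items = [
        {
            "question": "How long does it take to repaint a kitchen?",
            "answer": "Most kitchens can be repainted in two to three days, including prep and drying time.",
        },
    ]

    # Build FAQPage structured data following the schema.org vocabulary.
    structured_data = {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": item["question"],
                "acceptedAnswer": {"@type": "Answer", "text": item["answer"]},
            }
            for item in faq_items
        ],
    }

    # Emit the JSON-LD, ready to embed in a <script type="application/ld+json"> tag.
    print(json.dumps(structured_data, indent=2))

The point is less the specific markup than the habit it enforces: each question gets one concise, self-contained answer that an assistant can read aloud.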
Remember that voice assistants and chatbots are quickly becoming an additional marketing interface, one that brings an entirely new way of interacting with customers and adds value to their experience.
For too long, a few large corporations (aka Big Tech) have dominated the Internet. In the process, they have taken control away from users. Web 3.0 could put an end to that, with many additional benefits.
The Big Tech companies take users’ personal data, track their activity across websites, and use that information for profit. If these large corporations disagree with what a user posts, they can censor the user, even if the post is accurate. In many cases, users are permanently blocked (deplatformed) from a website or platform. These large corporations use creative marketing strategies, aggressive business deals, and lobbyists to consolidate their power and promote proprietary technology at the expense of Open Source solutions and data privacy.
If you care about regaining ownership of your personal data, you should be excited about Web 3.0. If you want an Internet that provides equal benefit to all users, you should be excited about Web 3.0. If you want the benefits of decentralization, you should be excited about Web 3.0.
To understand Web 3.0, it is beneficial to have a brief history of the Internet.
Web 1.0 (1989 – 2005)
Web 1.0 was the first stage of the internet, frequently called the Static Web. There were a small number of content creators, and most internet users were consumers of that content. It was built on decentralized and community-governed protocols. Web 1.0 consisted mostly of static pages with limited functionality, commonly hosted by Internet Service Providers (ISPs).
Web 2.0 (2005 – 2022)
Web 2.0 was also called the Dynamic Internet and the Participative Social Web. It was primarily driven by three core areas of innovation: Mobile, Social, and Cloud Computing. The introduction of the iPhone in 2007 ushered in the mobile internet and drastically broadened both the user base and the usage of the Internet.
The Social Internet was introduced by websites like Friendster (2003), MySpace (2003), and Facebook (2004). These social networks coaxed users into content generation, including sharing photos, recommendations, and referrals.
Cloud Computing commoditised the production and maintenance of internet servers, web pages, and web applications. It introduced concepts like Virtual Servers and Software as a Service (SaaS). Amazon Web Services (AWS) launched in 2006 with the release of the Simple Storage Service (S3). Cloud computing is web-based computing that allows businesses and individuals to consume computing resources such as virtual machines, databases, processing, memory, services, storage, messaging, and events on a pay-as-you-go basis. Cloud Computing continues to grow rapidly, driven by advanced technologies such as Artificial Intelligence (AI) and Machine Learning (ML) and by the continued adoption of cloud-based solutions by enterprises.
Unfortunately Web 2.0 also brought about centralization, surveillance, tracking, invasive ads, censorship, deplatforming and the dominance of Big Tech.
Web 3.0 (In Progress)
Web 3.0 is still an evolving and broad-ranging concept, but rapid progress is being made. It was originally thought that the next generation of the Internet would be the Semantic Web. Tim Berners-Lee (known as the inventor of the World Wide Web) coined the term to describe a more intelligent Internet in which machines would process content in a humanlike way and understand information on websites both contextually and conceptually. Some progress was made on helping machines understand concept and context via metadata markup standards like Microformats. Web 3.0 was also meant to be a return to the original concept of the web: a place where one does not need permission from a central authority to post, there is no central control, and there is no single point of failure.
Those are idealistic goals, but recent technology developments like Blockchain have expanded the idea of what Web 3.0 could represent. The framework for Web 3.0 was expanded to include decentralization, self-governance, artificial intelligence, and token based economics. Web 3.0 includes a leap forward to open, trustless and permissionless networks.
The rise of technologies such as distributed ledgers and blockchain-based storage will allow data to be decentralized and will create a secure and transparent environment. This will hopefully put an end to most of Web 2.0’s centralization, surveillance, and exploitative advertising.
The adoption of cryptocurrency and digital assets is also a major part of Web 3.0. Web 2.0 introduced Big Tech’s monetization strategy of selling user data. Web 3.0 introduces monetization and micropayments using cryptocurrencies, which can be used to reward the developers and users of Decentralized Applications (DApps) and help ensure the stability and security of a decentralized network. Web 3.0 gives users informed consent when their data is sold and returns the profits to the user via digital assets and digital currency. Cryptocurrencies also use Decentralized Finance (DeFi) solutions to implement cross-border payments more effectively than traditional payment channels. The Metaverse and real-time 3D worlds are also part of what is envisioned for Web 3.0.
If you understand the benefits of decentralization, privacy, and open solutions it’s time to recognize the importance of Web 3.0!
The slow and steady decline (death) of organic Facebook reach began in 2014, when Brian Boland, Facebook’s VP of Advertising Technology at the time, said that Facebook was simply managing more ad content than it used to, and that News Feed space was thus more competitive among marketers.
As a result, brands have to spend more on paid advertising to get the same reach they used to enjoy by posting on their Facebook Page. Organic reach has steadily declined every year since 2014. At the time of this article, it would be surprising if a business or brand could generate meaningful organic reach from posts on its Facebook Page.
Facebook still has the largest user base among all the different social media channels. With over 2 billion users worldwide (and counting), marketers are hoping they’ll find their target audience here. Keep in mind that it is estimated that more than 50% of Facebook user accounts are fake.
However, it is increasingly difficult to reach a target audience on Facebook by just posting to a business’s Facebook Page. This benefits Facebook financially as marketers must turn to boosted posts and advertising which generate revenue for Facebook.
What is Facebook Organic Reach?
Organic Reach on Facebook is simply a measurement of how many people can find you on Facebook for free. It’s much like organic rankings on a search engine, although in the case of Facebook it’s based on aspects like popularity, post frequency, and other contributing factors. Organic reach is the number of users that see a post that has not been promoted or “boosted” using Facebook’s advertising solutions.
Why is Facebook Organic Reach Declining?
According to Mark Zuckerberg, there’s a good reason for the death of Facebook’s Organic Reach:
“Recently we’ve gotten feedback from our community that public content — posts from businesses, brands, and media — is crowding out the personal moments that lead us to connect more with each other.”
Specifically, Zuckerberg wants Facebook to be better geared to curate content that builds meaningful relationships. Despite the substantial drop in Organic Reach beginning in 2014, Facebook and Zuckerberg still think that there’s too much Organic Reach for a Page.
With Organic Reach now below 2%, Facebook is nearing the end of the road as a viable option for marketing your brand or reaching your target market on social media.
With Organic Reach almost nonexistent, engagement on Facebook Page posts is almost nonexistent as well.
Real Estate Brokerages have been ripe for disruption for a long time. Until recently, traditional real estate brokerages have been spared some of the enormous disruptions that have redefined other industries, but they are under increasing margin pressure, losing top producing agents, and losing market share.
I have spent most of my career analyzing market disruption, disintermediation, network effects, and flywheel effects. I have studied several different markets as new entrants or technologies took on legacy business practices, and dramatically disrupted the traditional way of doing business.
A few examples of new companies disrupting entrenched legacy business models are:
E-Trade vs. Stockbrokerages
Expedia vs. Travel Agencies
Amazon vs. Retail Stores
Uber vs. Taxis
Netflix vs. Blockbuster & Cable Television
Airbnb vs. The Hotel Industry
I have also observed this when new technologies took on legacy technologies. Examples include:
Mobile Phones vs. Land Line Telephones
Open Source Software vs. Closed Source Proprietary Software
Digital Advertising vs. Print Advertising
When possible, I try to take a position in markets that are undergoing some sort of major disruption, as it represents a tremendous financial opportunity if you play it correctly. It should be pointed out that this is rarely a ZERO SUM GAME. Yes, many legacy approaches will go away, but some end up re-inventing themselves so that they can succeed in the new competitive landscape. For example, I know of several really good travel agencies that are thriving by developing deep offerings in specialized market segments like fly fishing trips and adventure travel.
In my local real estate market (Bozeman, Montana), the large national franchised real estate brokerages have been losing market share for several years (see chart). The lack of consolidation among these national real estate brokerages is an indicator of a market undergoing not only a transformation but potentially a major disruption.
The large real estate brokerage franchises are not only losing market share in my local market, they are also facing increasing pressure on operating margins as their leverage over real estate agents declines. Residential real estate brokerage is a narrow-margin business with lots of competition for top talent. As the number of options available to top producing agents increases, these agents are successfully negotiating more favourable commission split arrangements with their brokerages and establishing commission caps that place a limit on the amount of commission they pay the brokerage.
In order to retain top producing agents (the lifeblood of revenue for a real estate brokerage), brokerages are having to offer better commission splits and nicer offices, and to invest in expensive Internet technology and marketing solutions. This is negatively impacting margins and profits… and many top producing agents are still leaving to join boutique brokerages or go out on their own as truly independent agents.
Real estate brokerages are very aware of the problem and the competitive threats. Gary Keller (founder of Keller Williams) said in 2018 that real estate brokerages have to dramatically redefine themselves in order to survive… and that they have less than 5 years to figure it out. Keller Williams and Coldwell Banker are both making massive investments in new proprietary technology in an attempt to compete effectively in the new landscape of residential real estate.
There are many new brokerage concepts popping up around the country competing for top producing real estate agents: 100% commission models, profit-sharing models, and hybrid models that mix the two. There are real estate brokerages like Compass that deploy capital (from lead investor SoftBank) as a competitive strategy to acquire market share and position themselves as technology companies. There are also the “iBuyers” like Opendoor and Offerpad that are attempting to remove many of the “pain points” associated with selling a home. Many top performing agents are leaving traditional real estate brokerages and going out on their own, or joining co-operative arrangements with other top performing agents. According to NAR, only 42% of REALTORS® were affiliated with a brokerage franchise in 2019.
Looking at the stock prices for a couple publicly traded real estate brokerage franchisors reveals how investors feel about their financial viability.
Realogy Holdings Corp. is an American publicly owned real estate and relocation services company. Realogy is the leading global franchisor of some of the most recognized brands in real estate including Sotheby’s International Realty, Coldwell Banker, Corcoran, ERA Real Estate, Century 21, and Better Homes and Gardens Real Estate among others.
Realogy estimates that for all U.S. existing homesale transactions in which a broker was involved, Realogy had approximately 16% market share of transaction dollar volume and approximately 13.5% market share of transactions in 2017.
That is an impressive portfolio of real estate brands. However, investors have punished the stock. In 2013, Realogy (NYSE: RLGY) traded at a high of $53.53 per share, valuing the company at over $6 Billion. In 2019 the stock was trading at $4.93 per share, valuing the company at less than $600 Million… less than 1/10th of its highest valuation.
Re/Max Holdings Inc is another American international real estate company that is publicly traded and operates through a franchise system. From 1999 until 2013 the company held the number one market share in the United States as measured by residential transaction sides.
Re/Max Holdings Inc (NYSE: RMAX) traded at a high of $67.20 per share in 2017 valuing the company at $1.19 Billion. In 2019 the stock traded for $25.67 valuing the company at $457 Million… less than 1/2 what it was at its high.
At the time of this article, both of those stocks are trading up from the lows they hit earlier in 2019. To be determined is whether that is a “dead cat bounce.” In finance, a dead cat bounce is a small, brief recovery in the price of a declining stock.
Looking at the 2019 Residential Real Estate Franchise Report reveals the Franchise Fees for these well known real estate brokerage franchises range from $25K to $35K with ongoing Royalty Fees of between 5% and 6% of gross commissions, and Monthly Marketing/Advertising Fees of 1%. If these real estate brokerage franchises were delivering tremendous value, they would be able to command higher franchise fees, and their stocks would be trading at much higher valuations.
Technology is currently empowering real estate agents at the expense of the brokerage, but it is also predicted to reduce the importance of both real estate agents and brokerages over time. According to researchers at Oxford University, the potential for artificial intelligence computer algorithms to replace real estate brokers is estimated at 97%. The future of the real estate brokerage is under immense pressure, and is changing fast!
Why Are Traditional Real Estate Brokerages In Decline?
Over the last 15+ years almost everything about residential real estate has gone online. This has changed the consumer buying and selling behavior. Previously a real estate brokerage controlled access to information about properties for sale. Consumers used to think the real estate brokerage with the most signs around town or the most advertisements in the local newspaper was the best option.
The brand of the real estate brokerage an agent was associated with used to matter to the consumer. As a result, the value the brokerage brought to the agent was significant.
Today, consumers do not make decisions based on the brand of the real estate brokerage, they choose the real estate agent they think will do the best job for them. The brand of the real estate brokerage increasingly has little to do with the transaction other than cash the check and keep a percentage of the commission. It is the interaction with the real estate agent and the agent’s expertise that creates consumer satisfaction, not the brand of the real estate brokerage.
The National Association Of REALTORS reported in 2016 that only 2% of consumers chose their agent based on the brokerage brand they are affiliated with.
In an effort to retain top producing agents, many brokerages are still trying to convince agents that the brand of the brokerage is important, and that their brand has significant value with consumers.
The reality is that consumers do not care about the brokerage brand an agent is associated with.
The brokerage used to be the consumer’s first stop. Now the consumer’s first step is to go online, and find an agent they want to work with.
Today, information on homes for sale can easily be accessed by looking at websites like Zillow, Trulia, Redfin, Realtor.com, and hundreds of other property search related websites. Consumers no longer consider a brokerage a significant component in the value chain associated with buying or selling a property.
When it comes to data, the largest and most valuable data or information in the real estate industry is basically off limits to brokerage firms. The most valuable data is client and customer data, and that is owned by the agents and is typically closely guarded by them. Brokerages have access to data about listings, sales, revenues and costs. That information is valuable, but the customer data owned by the agents is much more valuable.
I have spent the last 5 years looking at technology solutions for real estate agents. The best solutions tend to be marketed at individual agents, or teams of agents, not real estate brokerages. When agents purchase these marketing tools and take over their own technology stack, the grip that the brokerage has on the agent gets even weaker.
As a result real estate brokerages are not only increasingly losing their influence over consumers, they are also increasingly losing their leverage with real estate agents… especially top performing real estate agents.
What Does The Real Estate Brokerage Of The Future Look Like?
I try not to make public predictions… preferring to make private investments and get positioned to benefit financially for how I predict markets will evolve.
However, you can look at the way markets have behaved in the past to predict the future. One of the Immutable Laws of Markets is that over the long term they seek efficiency. This usually leads to disintermediation (the removal of unnecessary intermediaries from the business process). Many of the top performing real estate agents in my local market are extremely successful without being associated with a traditional real estate brokerage. They have “disintermediated” the traditional brokerage, removing an extra layer from the transaction process.
Are national real estate brokerage franchises necessary in the real estate transaction process or are they a layer of overhead and expense that can be removed? 42% of agents already feel they are not worth the additional cost.
In 2014, my wife and I identified real estate as a market we felt would be disrupted by technology, and we started putting a strategy in place to capitalise on it. We visited with friends to understand their frustrations with the existing real estate process, and we began developing a technology stack and a market strategy so that we could quickly grab market share as the industry transformed. We both became real estate agents, and then real estate brokers, so that we could run our own brokerage if we chose to. In Q4 of 2019 we started our own brokerage so that we could be more agile, respond rapidly to the changes we were seeing in the market, and take complete control of the transaction process, our market strategy, and our marketing budget.
We feel we have created a more financially efficient and more market effective real estate company that delivers substantial value to our clients.
I can’t say with any degree of certainty how this will all play out, but looking at the market share data for real estate brokerages in my area of the world, it is clear that change is already occurring.
Having studied market disruption for over 25 years, I have noticed one thing all of the disruption cycles have had in common. First the disruption happens slowly, then it happens quickly as Network Effects and Flywheel Effects kick in.
Traditional Real Estate Brokerages are under extreme pressure. They are suffering from declining margins, losing top producing agents, and in my local area they are losing market share.
If real estate brokerages do not redefine themselves, start providing massive value to home buyers and sellers, and figure out how to bring substantial value to real estate agents they are at risk of having the same outcome as stock brokerages and travel agencies.
The drop-off in Facebook usage has been higher among younger users, but the decline is seen in every age and gender demographic. It’s not as if only young people, or older Americans, or women are using Facebook less. Every studied group is using Facebook less.
If you’ve found yourself spending less time on Facebook over the last year, you’re not alone. As the beleaguered company has battled scandal after scandal and tried to emphasize “meaningful” interactions over fake news and clickbait, users are spending less time on the service. The declining usage of Facebook explains the company’s recent decision to stop disclosing metrics for its main app and its shift toward private interactions.
It was assumed that Instagram was the bright spot for growth at Facebook, but a recent Bank of America report claims that mobile downloads of both Facebook and Instagram apps are declining.
The report, which cites data from research firm Sensor Tower, claims that combined downloads of Facebook and Instagram fell 13% year-over-year in the third quarter of 2019 so far. Facebook’s downloads declined 15%, while Instagram’s downloads dropped 9%.
Americans record their lowest ‘satisfaction’ with Facebook since 2015, according to the ACSI.
A steady stream of surveys and anecdotal evidence indicates Americans are less trusting and engaged with Facebook since the Cambridge Analytica data-privacy scandal broke in 2018. A report from ACSI explains Facebook trails all other social media sites in user satisfaction “by a wide gap.” This is not limited to privacy issues, according to the survey, it extends to site functionality, ads and content.
Users also find advertising on Facebook to be more intrusive than other sites. But the bad news doesn’t stop there. In two new ACSI measures for social media websites, Facebook falls dead last. Users feel that the ease of uploading photos and videos is subpar compared with other sites in the category. Moreover, users are frustrated with Facebook’s news feed as the site’s content rates worst in class for relevancy.
Engagement with Facebook is set to decline or remain flat for the foreseeable future, according to a new report from eMarketer.
Reasons for the decline in Facebook are:
1. Increased Distrust of Facebook
2019 has not been a great year for Facebook as a company. The media has consistently covered Facebook’s role in the propagation of “fake news.” When you combine the fake news problem with the company’s other problems around privacy and accountability, you end up with an environment where users of the platform may not fully trust the motives and judgment of the company that operates it.
Given that Facebook has access to many of its users’ most important personal data points, photos, and feelings a drop in trust is likely responsible for a drop in usage.
2. Increased Discord on Facebook
If you’re a Facebook user you have probably seen “friends” say they are logging off of Facebook for good because of the rampant negativity present on the platform.
In the shadow of the presidential election, there has been a continued polarization of thought in America, and an acceptance that the new normal is a climate of “us” vs. “them.” This is tiring. Each time you express an opinion on Facebook you must defend that opinion from segments of your “friends” who are now “the opposition.” This makes Facebook very unenjoyable to use.
3. Increased Disinterest in Facebook
I don’t know that it was ever “cool” to say you use Facebook… but now it feels increasingly not cool to admit you use it. Social Media platforms are a bit like fashion. One day you are hot, the next day you are not.
We’ve seen countless websites rise and fall in popularity, seemingly overnight, and a handful of sites that have stuck around for 20 years or even longer. I have grown increasingly used to the idea that most social platforms will eventually burn out or fade away. MySpace is a perfect example of this on the social media front. MySpace is still technically alive but it is nowhere close to the level of popularity it used to have.
Some of the decline in Facebook is probably due to a shift of users to other parts of the Facebook ecosystem. While Facebook’s usage declines, it will be important to see what happens with Instagram, WhatsApp, and Facebook Messenger.
After all, one of Facebook’s most attractive elements is that you can do a LOT of different things on the platform. But that’s also one of its great weaknesses. Is Facebook the BEST place for video? Probably not. Is it the BEST place for photos? Probably not. Is it the BEST place for messaging? Maybe.
When additional surveys and data are available, I predict they will reveal an even greater drop in daily Facebook usage. While some people have signed off of the platform entirely, I believe the bigger change is people using Facebook a couple of times a week instead of every day.
Facebook lied about video viewing metrics, overstating them by 150% to 900%. They caused media companies to go bankrupt, resulting in the loss of jobs. Facebook gets away with a tiny fine and no acknowledgement of how they faked these metrics.
Another day, another example of the cesspool that is Facebook. It isn’t simply a coincidence that every time Facebook gets caught faking a metric, the fakery was benefiting the company financially. Facebook paid a record-breaking $5 billion fine as part of a settlement with the Federal Trade Commission’s investigation of the Cambridge Analytica scandal. This time, Facebook only has to pay a paltry $40 million fine. That isn’t even a slap on the wrist, as it is only about 0.18% of their annual income. Facebook will pay the fine without acknowledging any responsibility or culpability, as is to be expected of Mark Zuckerberg.
Advertisers sued Facebook in 2016 over user metrics that supposedly measured the average length of time consumers spent viewing posted video ads. The lawsuit said that the time was inflated by up to 900 percent and that helped convince advertisers to buy Facebook’s video advertising services.
Faced with claims of violating unfair competition law, breaching contract and committing fraud, Facebook contested advertisers’ injuries, questioning whether they really relied on these metrics in deciding to purchase ad time. In early rounds in the litigation, Facebook was successful in getting the judge to pare the claims, though until a settlement was announced, several of the claims including fraud were still live. Even after agreeing to pay $40 million for settlement, Facebook maintains the suit is “without merit.”
In 2017, approximately 98% of Facebook’s revenue was generated by advertising, equal to about $39.9 Billion US Dollars. It has been reported that more than 50% of Facebook users are fake. That means that not only did Facebook generate revenue from inflated video views, it is also generating substantial revenue from fake users.