Digital Experience Platforms (DXP):

In the rapidly evolving digital landscape, businesses are navigating a complex terrain to engage users across various touchpoints. Digital Experience Platforms (DXP) have emerged as a core strategy for organizations seeking to optimize their digital interactions, ensuring seamless and immersive experiences for users. A DXP is an integrated suite of technologies that provides a comprehensive solution for managing the entire digital experience lifecycle.

  • 84% of customers say that being treated like a person is key to winning their business
  • The No. 1 reason customers switch brands is because they feel unappreciated

The technology that organizations use to establish and enhance customer experience (CX) continues to evolve. Digital experience platforms evolved from web content management systems (CMS) to help marketers meet rising customer expectations. Digital Experience Platforms (DXPs) now serve as the backbone for customer experience, allowing brands to deliver the personalized and relevant digital experiences customers expect today.

Key Capabilities of Digital Experience Platforms

  1. Content Management

    • DXPs offer robust content management systems (CMS) to create, edit, and publish content across different digital channels.
    • Personalization tools enable dynamic content based on user preferences, behavior, and demographics (a simple rule-based sketch follows this list).
  2. Multi-Channel Delivery

    • DXPs facilitate consistent delivery of content across diverse channels, including websites, mobile apps, social media, and more.
    • Responsive design ensures optimal user experiences on various devices.
  3. User Analytics

    • Advanced analytics tools track user behavior, engagement patterns, and content performance.
    • Insights gathered help businesses make data-driven decisions to enhance user satisfaction.
  4. Commerce Integration

    • Seamless integration with e-commerce platforms allows businesses to offer a unified and efficient buying experience.
    • Features like product recommendations and personalized shopping carts enhance user engagement.
  5. Marketing Automation

    • DXPs provide robust marketing automation tools for targeted campaigns, email marketing, and lead nurturing.
    • Automation streamlines marketing efforts, fostering better user engagement and conversion rates.
  6. Social Media Integration

    • Social media features enable content sharing, user-generated content integration, and social listening.
    • DXPs leverage social channels for brand promotion and community building.
  7. Customer Relationship Management (CRM)

    • Integration with CRM systems enables businesses to manage customer relationships effectively.
    • Unified customer data allows for personalized interactions and improved customer service.
  8. Search and Navigation

    • Advanced search capabilities and intuitive navigation enhance user experience.
    • AI-powered search algorithms provide relevant and quick results.
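
To make the personalization capability above more concrete, here is a minimal, hypothetical sketch of rule-based content selection in Python. The segment rules, attribute names, and banner variants are illustrative assumptions, not features of any specific DXP; real platforms layer analytics and machine learning on top of rules like these.

```python
# Hypothetical rule-based personalization: pick a content variant for a visitor
# based on simple profile attributes.
from dataclasses import dataclass

@dataclass
class Visitor:
    country: str
    visits: int
    last_viewed_category: str

def choose_hero_banner(visitor: Visitor) -> str:
    """Return the banner variant to show this visitor."""
    if visitor.visits == 0:
        return "welcome_new_visitor"      # first-time visitors get an intro offer
    if visitor.last_viewed_category == "shoes":
        return "shoes_spring_collection"  # continue the journey they started
    if visitor.country == "DE":
        return "eu_free_shipping"         # region-specific promotion
    return "default_brand_story"          # safe fallback for everyone else

print(choose_hero_banner(Visitor(country="US", visits=3, last_viewed_category="shoes")))
# -> shoes_spring_collection
```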


Popular Digital Experience Platform Providers

  • Adobe Experience Cloud

    Adobe Experience Cloud and Adobe Experience Manager (AEM) are Adobe products that help businesses create and manage digital experiences. Adobe Experience Cloud is a suite of cloud-based marketing and analytics solutions that help businesses create, manage, and measure their digital experiences. It includes a wide range of products, such as Adobe Experience Manager (AEM) for content management, Adobe Target, Adobe Analytics, and Adobe Campaign. Adobe Experience Cloud is designed to help businesses understand their customers, personalize their marketing campaigns, and measure the results of their efforts.

  • Sitecore

    Sitecore Digital Experience Platform (DXP) is a cloud-native platform that helps organizations create, manage, and deliver personalized digital experiences across multiple channels. It provides a comprehensive set of tools for content management, marketing automation, commerce, and analytics.

  • Salesforce Experience Cloud

    Salesforce Experience Cloud is a suite of cloud-based digital experience (DX) solutions that helps businesses create, manage, and deliver personalized experiences across multiple channels, including websites, mobile apps, portals, and communities. It provides a comprehensive set of tools for customer experience (CX), marketing, commerce, and analytics.

  • Oracle Cloud CX

    Oracle Cloud CX, formerly known as Oracle Advertising and Customer Experience (CX), is a cloud-based suite of applications that help businesses create, manage, and deliver personalized customer experiences across multiple channels. It includes a range of products, such as Oracle Marketing Cloud, Oracle Sales Cloud, Oracle Service Cloud, and Oracle Commerce Cloud.

  • Acquia

    Acquia Digital Experience Platform (DXP) is a cloud-based platform that helps businesses create, manage, and deliver personalized digital experiences across multiple channels. It is a comprehensive suite of tools that includes content management, marketing automation, commerce, and analytics. The core Acquia Drupal Cloud platform is based on the open-source Drupal CMS, but the Acquia DXP also includes a number of proprietary modules and extensions that are not open source.

  • Liferay

    The Liferay DXP is a powerful and flexible platform that can be used to build a wide variety of digital experiences, including websites, portals, intranets, and mobile apps. It is a popular choice for enterprises that need a scalable and extensible platform that can support their digital transformation initiatives. The core Liferay Portal platform is available under the LGPL v3 license, which is an open-source license that allows for free modification and distribution of the software. Liferay DXP modules and many of the Liferay DXP extensions are proprietary.

What is a composable DXP?

A composable DXP, or Digital Experience Platform, is a modular approach to building digital experiences that utilizes best-of-breed components from various sources. It contrasts with the traditional monolithic DXP, which provides a fixed set of features and functionality.

A composable DXP offers several advantages, including:

  • Flexibility: Composable DXPs allow organizations to select and combine components that align with their specific needs and requirements. This flexibility enables them to create tailored digital experiences that match their unique brand and user personas.
  • Agility: Composable DXPs facilitate faster development and deployment of digital experiences. By leveraging pre-built components, organizations can quickly assemble and adapt their digital platforms to meet evolving market trends and customer demands.
  • Scalability: Composable DXPs are inherently scalable, allowing organizations to seamlessly add or remove components as their needs change. This scalability ensures that their digital platforms can grow alongside their business.
  • Cost-effectiveness: Composable DXPs can potentially reduce costs by eliminating the need for expensive custom development and maintenance of a monolithic platform. Organizations can instead focus on selecting and integrating components that provide the best value for their money.

Conclusion

In the era of digital transformation, Digital Experience Platforms play a pivotal role in orchestrating cohesive and engaging user experiences. By integrating various technologies into a unified solution, DXPs empower businesses to adapt to changing user expectations, foster brand loyalty, and drive engagement and growth in the digital realm. The selection of a DXP should align with the specific needs and goals of each business, ensuring a tailored and effective digital strategy.


The Dead Internet Theory: Separating Fact from Fiction

In recent years, a concept known as the “Dead Internet Theory” has emerged, suggesting that the internet as we know it is slowly dying or becoming obsolete. This theory has sparked debates and raised questions about the future of the digital age. In this article, we will explore the Dead Internet Theory, its origins, its arguments, and the reality behind this provocative concept.

Understanding the Dead Internet Theory:

The Dead Internet Theory posits that the internet is losing its original spirit of openness, freedom, and decentralization. Proponents argue that increasing corporate control, government surveillance, censorship, and the dominance of a few powerful entities are stifling innovation and transforming the internet into a highly regulated and centralized platform. They claim that the internet is losing its vitality and becoming a mere reflection of offline power structures.

Origins and Key Arguments:

The origins of the Dead Internet Theory can be traced back to concerns raised by early internet pioneers and activists who championed a free and open internet. They feared that the commercialization and consolidation of online platforms would undermine the principles that made the internet a transformative force.

Among the concerns put forth by the Dead Internet Theory are fake traffic from bots and fake user accounts. According to numerous sources, a significant percentage of Internet traffic comes from bots, with estimates ranging from a low of 42% to over 66% of all traffic. Bots are automated software programs that perform various tasks, ranging from simple to complex, on the internet. They can be beneficial, such as search engine crawlers or chatbots, or malicious, like spam bots or distributed denial-of-service (DDoS) attack bots.

The problems associated with bot traffic arise primarily from malicious bots that engage in fraudulent activities, spamming, data scraping, click fraud, credential stuffing, and more. These activities can have severe consequences, including financial losses, compromised security, reputational damage, and disruptions to legitimate online services.
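
To illustrate the problem in practical terms, the sketch below applies two naive heuristics that site operators often use as a first pass when estimating bot traffic in their own logs: a user-agent keyword check and a request-rate threshold. The keyword list and threshold are illustrative assumptions; production bot detection relies on far richer behavioral and fingerprinting signals.

```python
# Naive first-pass bot flagging over parsed web-server log records.
# Assumed record shape: (client_ip, user_agent, unix_timestamp).
from collections import defaultdict

BOT_UA_KEYWORDS = ("bot", "crawler", "spider", "curl", "python-requests")  # illustrative list
MAX_REQUESTS_PER_MINUTE = 120  # illustrative threshold

def flag_suspected_bots(records):
    per_ip_minutes = defaultdict(lambda: defaultdict(int))
    flagged = set()

    for ip, user_agent, ts in records:
        # Heuristic 1: self-identified or scripted user agents
        if any(keyword in user_agent.lower() for keyword in BOT_UA_KEYWORDS):
            flagged.add(ip)
        # Heuristic 2: unusually high request rate from a single IP
        minute = int(ts // 60)
        per_ip_minutes[ip][minute] += 1
        if per_ip_minutes[ip][minute] > MAX_REQUESTS_PER_MINUTE:
            flagged.add(ip)

    return flagged

records = [("10.0.0.1", "Mozilla/5.0", 0), ("10.0.0.2", "python-requests/2.31", 1)]
print(flag_suspected_bots(records))  # {'10.0.0.2'}
```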

Social media platforms have been grappling with the challenge of fake user accounts for some time. These accounts are created for various purposes, including spreading misinformation, engaging in spam or fraudulent activities, manipulating public opinion, or conducting malicious campaigns. Increasingly, bots use these fake accounts to “push” a narrative. This activity is very noticeable during political elections.

Proponents of The Dead Internet Theory highlight several key arguments:

  • Centralization and Monopolistic Power: They argue that a small number of tech giants now dominate the internet, controlling vast amounts of data and shaping user experiences. This concentration of power limits competition and stifles smaller players’ ability to innovate.
  • Surveillance and Privacy Concerns: With the rise of surveillance technologies and data breaches, privacy advocates express worry that individuals’ online activities are constantly monitored and exploited for various purposes, eroding trust in the internet.
  • Censorship and Content Control: The theory also highlights instances of government-imposed censorship, content moderation challenges, and algorithmic biases, suggesting that freedom of expression is under threat.
  • Net Neutrality and Access: Advocates argue that the internet’s openness is compromised by practices that prioritize certain types of traffic or restrict access based on geographic location or socioeconomic factors, leading to a digital divide.

The Reality:

While the concerns raised by the Dead Internet Theory hold some validity, it is important to approach the subject with nuance. The internet remains a dynamic and evolving medium, shaped by technological advancements and societal changes. While challenges exist, numerous initiatives and movements aim to preserve the internet’s founding principles.

Efforts such as decentralized technologies (like blockchain), open-source software, encryption tools, and net neutrality advocacy strive to counteract centralization, surveillance, and censorship. Additionally, the proliferation of alternative platforms, social networks, and online communities ensures that diverse voices and opinions can still find a space online.

Final Thoughts:

The Dead Internet Theory serves as a reminder of the ongoing struggle to maintain an open, free, and decentralized internet. While concerns over centralization, surveillance, and censorship are valid, the internet is not irreversibly “dead.” It continues to evolve, driven by the collective actions of individuals, organizations, and policymakers. It is important that the internet remains a powerful tool for connectivity, knowledge-sharing, and empowerment.


How Voice Technology and Chatbots will Influence Marketing in the Future

The number of people using conversational AI tools, such as voice assistants and chatbots, is rapidly growing and redefining users’ relationship with technology. Voice technology has become the most disruptive force to hit the world since the internet became a visual medium. Voice assistants and chatbots have become an additional interface for marketing purposes. They bring an entirely new way of interacting with customers and add a higher value to their experience.

123.5 million US adults will use voice assistants at least once per month in 2022. That number is expected to grow to approximately 50% of all US adults in the next 3 years.

Some of the most well-known and widely used voice-activated assistants include:

  • Alexa, developed by Amazon
  • Google Assistant, developed by Google
  • Siri, developed by Apple
  • Cortana, developed by Microsoft
  • Bixby, developed by Samsung

ChatGPT is a variant of the GPT (Generative Pre-trained Transformer) language model, which was developed by OpenAI. It is a machine learning model that is designed specifically for chatbot applications and is able to generate natural-sounding responses to a wide variety of inputs. The goal of ChatGPT is to make AI systems more natural and safe to interact with. The service amassed millions of users in less than a month.

While ChatGPT is not a traditional voice-activated assistant like Alexa or Siri, it is capable of generating text-based responses to user inputs and can be used in a chatbot application that allows users to communicate with it using natural language. It is not designed to be used with voice input, but rather is intended to generate text responses that can be displayed on a screen or presented to the user in some other way.
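
As a rough illustration of how a text-based model like this gets wired into a chatbot application, the sketch below posts a user message to an OpenAI-style chat completions HTTP endpoint and prints the reply. The endpoint path, model name, and environment variable are assumptions for illustration only; check the provider’s current API documentation before relying on any of them.

```python
# Minimal sketch of calling a hosted chat model over HTTP.
# Assumes an OpenAI-style /v1/chat/completions endpoint and an API key
# in the OPENAI_API_KEY environment variable (both are assumptions here).
import os
import requests

def ask_chatbot(prompt: str) -> str:
    response = requests.post(
        "https://api.openai.com/v1/chat/completions",
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        json={
            "model": "gpt-3.5-turbo",  # assumed model name
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask_chatbot("Suggest three subject lines for a spring sale email."))
```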

ChatGPT surpassed 1 million users in just 5 days. To put that in perspective, it took Netflix over 41 months, Facebook 10 months, and Instagram 2.5 months to reach 1 million users.

Google’s management has reportedly issued a ‘code red’ amid the rising popularity of the ChatGPT AI. The move comes as talks abound over whether ChatGPT could one day replace Google’s search engine.

Sridhar Ramaswamy, who oversaw Google’s ad team between 2013 and 2018, said that ChatGPT could prevent users from clicking on Google links with ads, which generated $208 billion in 2021 (81% of Alphabet’s overall revenue).

Google has a similar technology called LaMDA. LaMDA is a machine learning model developed by Google for natural language processing tasks, particularly dialogue. It is built on the Transformer model, a type of neural network architecture that is particularly well-suited to processing and generating large amounts of text data.

LaMDA is an acronym that stands for “Language Model for Dialogue Applications.” It is designed for open-ended conversation: because it was trained on dialogue, it can produce responses that follow the flow of a conversation rather than generic answers, and it can be applied to a wide range of natural language processing tasks.

Google LaMDA comes out of the Google AI research group and is used in a variety of applications, including language understanding and chatbot development. It is a powerful and flexible machine learning model that has the potential to significantly advance the state of the art in natural language processing.

Google is hesitant to release its AI chatbot LaMDA to the public in its current state because of concerns about “reputational risk” due to its high margin of error and vulnerability to toxicity. Like most technologies, this tech can be abused.

Integrating voice assistants and chatbots into your marketing strategy isn’t easy.

Your content marketing and editorial strategies should reflect how your business plans to leverage the technology and how invested you are in it from a content point of view. For most businesses, voice and chatbot marketing should start with “search.” 2022 saw over one billion voice searches conducted in a month. Content marketers must emphasize short-form content that offers quick, crisp answers in order to engage consumers.

SEO techniques can be used to target voice-activated search results because the answers read out by Siri, Cortana, and other assistants frequently draw on featured snippets. When focusing on voice-search optimization, it is essential to remember that the virtual assistant can only deliver a single search result per request. Marketers should adopt SEO guidelines related to spoken-word search behaviors and informational needs.
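
One practical way to target featured snippets, and by extension voice answers, is to mark up question-and-answer content with schema.org FAQPage structured data. The sketch below generates that JSON-LD from a list of Q&A pairs; the questions and answers are placeholders, and the output should be validated with a structured-data testing tool before publishing.

```python
# Generate schema.org FAQPage JSON-LD for question-and-answer content.
import json

def faq_jsonld(qa_pairs):
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in qa_pairs
        ],
    }, indent=2)

snippet = faq_jsonld([
    ("What is voice search optimization?",
     "Structuring content so voice assistants can read a single, concise answer aloud."),
])
print(f'<script type="application/ld+json">\n{snippet}\n</script>')
```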

Please remember that voice assistants and chatbots are quickly becoming an additional interface for marketing purposes. They bring an entirely new way of interacting with customers and add a higher value to their experience.


WHAT IS WEB 3.0?

For too long, a few large corporations (aka Big Tech) have dominated the Internet. In the process they have taken control away from users. Web 3.0 could put an end to that, with many additional benefits.

The Big Tech companies take users’ personal data, track their activity across websites, and use that information for profit. If these large corporations disagree with what a user posts, they can censor them even if the user posts accurate information. In many cases, users are permanently blocked (deplatformed) from using a website or platform. These large corporations use creative marketing strategies, aggressive business deals, and lobbyists to consolidate their power and promote proprietary technology at the expense of Open Source solutions and data privacy.

If you care about regaining ownership of your personal data, you should be excited about Web 3.0. If you want an Internet that provides equal benefit to all users, you should be excited about Web 3.0. If you want the benefits of decentralization, you should be excited about Web 3.0.

To understand Web 3.0, it is beneficial to have a brief history of the Internet.

Web 1.0 (1989 – 2005)

Web 1.0 was the first stage of the internet. It is also frequently called the Static Web. There were a small number of content creators, and most internet users were consumers of that content. It was built on decentralized and community-governed protocols. Web 1.0 consisted mostly of static pages that had limited functionality and were commonly hosted by Internet Service Providers (ISPs).

Web 2.0 (2005 – 2022)

Web 2.0 was also called the Dynamic Internet and the Participative Social Web. It was primarily driven by three core areas of innovation: Mobile, Social, and Cloud Computing. The introduction of the iPhone in 2007 brought about the mobile internet and drastically broadened both the user base and the usage of the Internet.

The Social Internet was introduced by websites like Friendster (2003), MySpace (2003), and Facebook (2004). These social networks coaxed users into content generation, including sharing photos, recommendations, and referrals.

Cloud Computing commoditized the production and maintenance of internet servers, web pages, and web applications. It introduced concepts like Virtual Servers and Software as a Service (SaaS). Amazon Web Services (AWS) launched in 2006 by releasing the Simple Storage Service (S3). Cloud computing is Web-based computing that allows businesses and individuals to consume computing resources such as virtual machines, databases, processing, memory, services, storage, messaging, and events on a pay-as-you-go basis. Cloud Computing continues to grow rapidly, driven by advanced technologies such as Artificial Intelligence (AI) and Machine Learning (ML), and by the continued adoption of cloud-based solutions by enterprises.

Unfortunately Web 2.0 also brought about centralization, surveillance, tracking, invasive ads, censorship, deplatforming and the dominance of Big Tech.

Web 3.0 (In Progress)

Web 3.0 is still an evolving and broad-ranging concept, but rapid progress is being made. It was originally thought that the next generation of the Internet would be the Semantic Web. Tim Berners-Lee (known as the inventor of the World Wide Web) coined the term to describe a more intelligent Internet in which machines would process content in a humanlike way and understand information on websites both contextually and conceptually. Some progress was made on helping machines understand concept and context via metadata markup standards like Microformats. Web 3.0 was also to be a return to the original concept of the web: a place where one does not need permission from a central authority to post, there is no central control, and there is no single point of failure.

Those are idealistic goals, but recent technology developments like Blockchain have expanded the idea of what Web 3.0 could represent. The framework for Web 3.0 was expanded to include decentralization, self-governance, artificial intelligence, and token based economics. Web 3.0 includes a leap forward to open, trustless and permissionless networks.

The rise of technologies such as distributed ledgers and storage on blockchain will allow data to be decentralized and will create a secure and transparent environment. This will hopefully put an end to most of Web 2.0’s centralization, surveillance, and exploitative advertising.
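
As a simplified illustration of why a distributed ledger is considered tamper-evident, the toy sketch below chains records together by including each block’s hash in the next block, so changing any earlier record invalidates everything after it. This is a teaching example only; it omits consensus, networking, and mining entirely.

```python
# Toy hash chain: each block commits to the previous block's hash,
# so altering any historical record breaks every later link.
import hashlib
import json

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, data: str) -> None:
    previous = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "data": data, "prev_hash": previous})

def is_valid(chain: list) -> bool:
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1]) for i in range(1, len(chain)))

ledger = []
add_block(ledger, "alice pays bob 5 tokens")
add_block(ledger, "bob pays carol 2 tokens")
print(is_valid(ledger))                          # True
ledger[0]["data"] = "alice pays bob 500 tokens"  # tamper with history
print(is_valid(ledger))                          # False
```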

The adoption of Cryptocurrency and Digital Assets is also a major part of Web 3.0. Web 2.0 introduced the Big Tech monetization strategy of selling user data. Web 3.0 introduces monetization and micropayments using cryptocurrencies, which can be used to reward developers and users of Decentralized Applications (DApps) and help ensure the stability and security of a decentralized network. Web 3.0 gives users informed consent over the sale of their data and returns the profits to them via digital assets and digital currency. Cryptocurrencies also use Decentralized Finance (DeFi) solutions to implement cross-border payments more effectively than traditional payment channels. The Metaverse and real-time 3D Worlds are also part of what is envisioned for Web 3.0.

If you understand the benefits of decentralization, privacy, and open solutions it’s time to recognize the importance of Web 3.0!


The Slow Death of Facebook Page Organic Reach

The slow and steady decline (death) in organic Facebook reach began in 2014, when Brian Boland, Facebook’s VP of Advertising Technology at the time, said Facebook is simply managing more ad content than it used to, and News Feed space is thus more competitive among marketers.

As a result, brands have to spend more on paid advertising to get the same reach they used to enjoy by posting on their Facebook Page. This trend started in 2014, and organic reach has steadily declined every year since then. At the time of this article, it would be surprising if a business or brand could generate meaningful organic reach from posts on their Facebook Page.

Facebook still has the largest user base among all the different social media channels. With over 2 billion users worldwide (and counting), marketers are hoping they’ll find their target audience here. Keep in mind that it is estimated that more than 50% of Facebook user accounts are fake.

However, it is increasingly difficult to reach a target audience on Facebook by just posting to a business’s Facebook Page. This benefits Facebook financially as marketers must turn to boosted posts and advertising which generate revenue for Facebook.

What is Facebook Organic Reach?

Organic Reach on Facebook is simply a measurement of how many people can find you on Facebook for free. It’s much like organic rankings on a search engine, although in the case of Facebook it’s based on aspects like popularity, post frequency, and other contributing factors. Organic reach is the number of users that see a post that has not been promoted or “boosted” using Facebook’s advertising solutions.
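
As a back-of-the-envelope illustration (the numbers below are invented), organic reach is usually expressed as a rate: the share of a Page’s followers who actually saw an unpaid post.

```python
# Hypothetical example: organic reach rate = people reached / page followers.
page_followers = 100_000
people_reached_organically = 1_800   # unique users who saw the unpaid post

organic_reach_rate = people_reached_organically / page_followers * 100
print(f"Organic reach rate: {organic_reach_rate:.1f}%")  # Organic reach rate: 1.8%
```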

Why is Facebook Organic Reach Declining?

According to Mark Zuckerberg, there’s a good reason for the death of Facebook’s Organic Reach:

“Recently we’ve gotten feedback from our community that public content — posts from businesses, brands, and media — is crowding out the personal moments that lead us to connect more with each other.”

Specifically, Zuckerberg wants Facebook to be better geared to curate content that builds meaningful relationships. Despite the substantial drop in Organic Reach beginning in 2014, Facebook and Zuckerberg still think that there’s too much Organic Reach for a Page.

With Organic Reach now below 2%, Facebook is nearing the end of the road in terms of being a viable option for marketing your brand or reaching your target market on social media.

With Organic Reach almost non-existent, engagement on posts on Facebook Pages is also almost non-existent.



The Decline of the Traditional Real Estate Brokerage

Real Estate Brokerages have been ripe for disruption for a long time. Until recently, traditional real estate brokerages have been spared some of the enormous disruptions that have redefined other industries, but they are under increasing margin pressure, losing top producing agents, and losing market share.

I have spent most of my career analyzing market disruption, disintermediation, network effects, and flywheel effects. I have studied several different markets as new entrants or technologies took on legacy business practices and dramatically disrupted the traditional way of doing business.

A few examples of new companies disrupting entrenched legacy business models are:

  • E-Trade vs. Stockbrokerages
  • Expedia vs. Travel Agencies
  • Amazon vs. Retail Stores
  • Uber vs. Taxis
  • Netflix vs. Blockbuster & Cable Television
  • Airbnb vs. the Hotel Industry

I have also observed this when new technologies took on legacy technologies. Examples include:

  • Mobile Phones vs. Land Line Telephones
  • Open Source Software vs. Closed Source Proprietary Software
  • Digital Advertising vs. Print Advertising

When possible, I try to take a position in markets that are undergoing some sort of major disruption, as it represents a tremendous financial opportunity if you play it correctly. It should be pointed out that this is rarely a ZERO SUM GAME. Yes, many legacy approaches will go away, but some end up re-inventing themselves so that they can succeed in the new competitive landscape. For example, I know of several really good travel agencies that are thriving by developing deep offerings in specialized market segments like fly fishing trips and adventure travel.

In my local real estate market (Bozeman, Montana), the large national franchised real estate brokerages have been losing market share for several years (see chart). The lack of consolidation among these national real estate brokerages is an indicator of a market undergoing not only a market transformation, but potentially a major disruption.

[Chart: Real Estate Brokerage Market Share - Bozeman, Montana]
Market share is in decline for the 6 leading national real estate brokerage franchises operating in Bozeman. The 6 brands represented in the chart are Berkshire Hathaway, Keller Williams, ERA, Christie’s International Real Estate, RE/MAX, and Sotheby’s International. The goal of this article isn’t to single out any specific real estate brand or business, but to analyze the broader market trends.

The large real estate brokerage franchises are not only losing market share in my local market, they are facing increasing pressure on operating margins as their leverage against real estate agents declines. Residential real estate brokerage is a narrow-margin business with lots of competition for top talent. As the number of options available to top producing agents increases, these agents are successfully negotiating more favorable commission split arrangements with the brokerage and establishing commission caps that place a limit on the amount of commission they pay the brokerage.

In order to retain top producing agents (the lifeblood of revenue for a real estate brokerage), real estate brokerages are having to offer better commission splits, provide nicer offices, and invest in expensive Internet technology and marketing solutions. This is negatively impacting margins and profits… and many top producing agents are still leaving to join boutique brokerages or go out on their own as truly independent agents.

Real Estate Brokerages are very aware of the problem, and the competitive threats. Gary Keller (founder of Keller Williams) said in 2018 that Real Estate Brokerages have to dramatically re-define themselves in order to survive… and he said they have less than 5 years to figure it out. Keller Williams and Coldwell Banker are both making massive investments in developing new proprietary technology in an attempt to compete effectively in the new landscape of residential real estate.

There are many new brokerage concepts popping up around the country competing for top producing real estate agents: 100% commission models, profit-sharing models, and hybrid models that are a mixture of both. There are real estate brokerages like Compass that deploy capital (from lead investor SoftBank) as a competitive strategy to acquire market share and try to position themselves as a technology company. There are also the “iBuyers” like Opendoor and Offerpad that are attempting to take away many of the “pain points” associated with selling a home. Many top performing agents are leaving traditional real estate brokerages and going out on their own, or joining co-operative type arrangements with other top performing real estate agents. According to NAR, only 42% of REALTORS® were affiliated with a brokerage franchise in 2019.

Looking at the stock prices for a couple publicly traded real estate brokerage franchisors reveals how investors feel about their financial viability.

Realogy Holdings Corp. is an American publicly owned real estate and relocation services company. Realogy is the leading global franchisor of some of the most recognized brands in real estate including Sotheby’s International Realty, Coldwell Banker, Corcoran, ERA Real Estate, Century 21, and Better Homes and Gardens Real Estate among others.

Realogy estimates that for all U.S. existing homesale transactions in which a broker was involved, Realogy had approximately 16% market share of transaction dollar volume and approximately 13.5% market share of transactions in 2017.

That is an impressive portfolio of real estate brands. However, investors have punished the stock. In 2013, Realogy (NYSE: RLGY) traded at a high of $53.53 per share, valuing the company at over $6 Billion. In 2019 the stock was trading at $4.93 per share, valuing the company at less than $600 Million… less than 1/10th its highest valuation.

Re/Max Holdings Inc is another American international real estate company that is publicly traded and operates through a franchise system. From 1999 until 2013 the company held the number one market share in the United States as measured by residential transaction sides.

Re/Max Holdings Inc (NYSE: RMAX) traded at a high of $67.20 per share in 2017 valuing the company at $1.19 Billion. In 2019 the stock traded for $25.67 valuing the company at $457 Million… less than 1/2 what it was at its high.

At the time of this article, both of those stocks are trading up from the low prices they hit earlier in 2019. It remains to be seen whether that is a “dead cat bounce.” In finance, a dead cat bounce is a small, brief recovery in the price of a declining stock.

Looking at the 2019 Residential Real Estate Franchise Report reveals the Franchise Fees for these well known real estate brokerage franchises range from $25K to $35K with ongoing Royalty Fees of between 5% and 6% of gross commissions, and Monthly Marketing/Advertising Fees of 1%. If these real estate brokerage franchises were delivering tremendous value, they would be able to command higher franchise fees, and their stocks would be trading at much higher valuations.

Technology is currently empowering real estate agents at the expense of the brokerage, but it is also predicted to reduce the importance of both real estate agents and brokerages over time. According to researchers at Oxford University, the potential for artificial intelligence computer algorithms to replace real estate brokers is estimated at 97%. The future of the real estate brokerage is under immense pressure, and is changing fast!

Why Are Traditional Real Estate Brokerages In Decline?

Over the last 15+ years almost everything about residential real estate has gone online. This has changed the consumer buying and selling behavior. Previously a real estate brokerage controlled access to information about properties for sale. Consumers used to think the real estate brokerage with the most signs around town or the most advertisements in the local newspaper was the best option.

The brand of the real estate brokerage an agent was associated with used to matter to the consumer. As a result, the value the brokerage brought to the agent was significant.

Today, consumers do not make decisions based on the brand of the real estate brokerage; they choose the real estate agent they think will do the best job for them. The brand of the real estate brokerage increasingly has little to do with the transaction other than cashing the check and keeping a percentage of the commission. It is the interaction with the real estate agent and the agent’s expertise that creates consumer satisfaction, not the brand of the real estate brokerage.

The National Association Of REALTORS reported in 2016 that only 2% of consumers chose their agent based on the brokerage brand they are affiliated with.

In an effort to retain top producing agents, many brokerages are still trying to convince agents that the brand of the brokerage is important, and that their brand has significant value with consumers.

The reality is that consumers do not care about the brokerage brand an agent is associated with.

The brokerage used to be the consumer’s first stop. Now the consumer’s first stop is online, where they find an agent they want to work with.

Today, information on homes for sale can easily be accessed by looking at websites like Zillow, Trulia, Redfin, Realtor.com, and hundreds of other property search related websites. Consumers no longer consider a brokerage a significant component in the value chain associated with buying or selling a property.

When it comes to data, the largest and most valuable data or information in the real estate industry is basically off limits to brokerage firms. The most valuable data is client and customer data, and that is owned by the agents and is typically closely guarded by them. Brokerages have access to data about listings, sales, revenues and costs. That information is valuable, but the customer data owned by the agents is much more valuable.

I have spent the last 5 years looking at technology solutions for real estate agents. The best solutions tend to be marketed at individual agents, or teams of agents, not real estate brokerages. When agents purchase these marketing tools and take over their own technology stack, the grip that the brokerage has on the agent gets even weaker.

As a result real estate brokerages are not only increasingly losing their influence over consumers, they are also increasingly losing their leverage with real estate agents… especially top performing real estate agents.

What Does The Real Estate Brokerage Of The Future Look Like?

I try not to make public predictions… preferring to make private investments and get positioned to benefit financially from how I predict markets will evolve.

However, you can look at the way markets have behaved in the past to predict the future. One of the Immutable Laws Of Markets is that over the long term they seek efficiency. This usually leads to disintermediation (the removal of unnecessary intermediaries in the business process). Many of the top performing real estate agents in my local market are extremely successful without being associated with a traditional real estate brokerage. They have “disintermediated” the traditional brokerage… removing an extra layer in the transaction process.

Are national real estate brokerage franchises necessary in the real estate transaction process, or are they a layer of overhead and expense that can be removed? With only 42% of REALTORS® affiliated with a franchise, a majority of agents have apparently already decided they are not worth the additional cost.

In 2014 my wife and I identified real estate as a market we felt would be disrupted by technology, and we started putting a strategy in place to capitalize on it. We visited with friends to understand their frustration with the existing real estate process, and we started developing a technology stack and a market strategy so that we could quickly grab market share as the industry transformed. We both became real estate agents, and then real estate brokers, so that we could run our own brokerage if we chose to. In Q4 of 2019 we started our own brokerage so that we could be more agile and rapidly respond to the changes we were seeing in the market, and so that we could take complete control of the transaction process, our market strategy, and our marketing budget.

We feel we have created a more financially efficient and more market effective real estate company that delivers substantial value to our clients.

I can’t say with any degree of certainty how this will all play out, but looking at the market share data for real estate brokerages in my area of the world, it is clear that change is already occurring.

Having studied market disruption for over 25 years, I have noticed one thing all of the disruption cycles have had in common. First the disruption happens slowly, then it happens quickly as Network Effects and Flywheel Effects kick in.

Traditional Real Estate Brokerages are under extreme pressure. They are suffering from declining margins, losing top producing agents, and in my local area they are losing market share.

If real estate brokerages do not redefine themselves, start providing massive value to home buyers and sellers, and figure out how to bring substantial value to real estate agents they are at risk of having the same outcome as stock brokerages and travel agencies.


Facebook Continues To Lose Users In Key Markets in 2019

The drop-off in Facebook usage has been higher among younger users, but the decline is seen across every age and gender demographic as well. It’s not as if only young people, or older Americans, or women are using Facebook less. Every studied group is using Facebook less.

If you’ve found yourself spending less time on Facebook over the last year, you’re not alone. As the beleaguered company has battled scandal after scandal and tried to emphasize “meaningful” interactions over fake news and clickbait, users are spending less time on the service. The declining usage of Facebook explains the company’s recent decision to stop disclosing metrics for its main app and its shift toward private interactions.

It was assumed that Instagram was the bright spot for growth at Facebook, but a recent Bank of America report claims that mobile downloads of both Facebook and Instagram apps are declining.

The report, which cites data from research firm Sensor Tower, claims that combined downloads of Facebook and Instagram fell 13% year-over-year in the third quarter of 2019 so far. Facebook’s downloads declined 15%, while Instagram’s downloads dropped 9%.

Americans record lowest ‘satisfaction’ with Facebook since 2015, according to the ACSI.

A steady stream of surveys and anecdotal evidence indicates Americans are less trusting of and engaged with Facebook since the Cambridge Analytica data-privacy scandal broke in 2018. A report from the ACSI (American Customer Satisfaction Index) explains that Facebook trails all other social media sites in user satisfaction “by a wide gap.” According to the survey, this is not limited to privacy issues; it extends to site functionality, ads, and content.

Users also find advertising on Facebook to be more intrusive than other sites. But the bad news doesn’t stop there. In two new ACSI measures for social media websites, Facebook falls dead last. Users feel that the ease of uploading photos and videos is subpar compared with other sites in the category. Moreover, users are frustrated with Facebook’s news feed as the site’s content rates worst in class for relevancy.

Engagement with Facebook is set to decline or remain flat for the foreseeable future, according to a new report from eMarketer.

Reasons for the decline in Facebook usage are:

1. Increased Distrust of Facebook

2019 has not been a great year for Facebook as a company. The media has consistently covered Facebook’s role in the propagation of “fake news.” When you combine the fake news problem with the company’s other problems with privacy and accountability, you end up with an environment where the users of the platform may not fully trust the motives and judgment of the company that operates the platform.

Given that Facebook has access to many of its users’ most important personal data points, photos, and feelings, a drop in trust is likely responsible for a drop in usage.

2. Increased Discord on Facebook

If you’re a Facebook user you have probably seen “friends” say they are logging off of Facebook for good because of the rampant negativity present on the platform.

In the shadow of the presidential election, there has been a continued polarization of thought in America, and an acceptance that the new normal is a climate of “us” vs. “them.” This is tiring. Each time you express an opinion on Facebook you must defend that opinion from segments of your “friends” who are now “the opposition.” This makes Facebook very unenjoyable to use.

3. Increased Disinterest in Facebook

I don’t know that it was ever “cool” to say you use Facebook… but now it feels increasingly not cool to admit you use it. Social Media platforms are a bit like fashion. One day you are hot, the next day you are not.

We’ve seen countless websites rise and fall in popularity, seemingly overnight, and a handful of sites that have stuck around for 20 years or even longer. I have grown increasingly used to the idea that most social platforms will eventually burn out or fade away. MySpace is a perfect example of this on the social media front. MySpace is still technically alive but it is nowhere close to the level of popularity it used to have.

Some of the decline in Facebook is probably due to a shift of users to other parts of the Facebook ecosystem. While Facebook’s usage declines, it will be important to see what happens with Instagram, WhatsApp, and Facebook Messenger.

After all, one of Facebook’s most attractive elements is that you can do a LOT of different things on the platform. But that’s also one of its great weaknesses. Is Facebook the BEST place for video? Probably not. Is it the BEST place for photos? Probably not. Is it the BEST place for messaging? Maybe.

When additional surveys and data are available, I predict they will reveal an even greater drop in daily usage of Facebook. While there are some people who have signed off of the platform entirely, I believe the bigger change is people using Facebook a couple of times a week instead of every day.

Stay tuned for more updates on this topic.



Facebook Busted Artificially Inflating Video Views

Facebook lied about video views, inflating them by 150% to 900%. The inflated numbers caused media companies to go bankrupt, resulting in the loss of jobs. Facebook gets away with a tiny fine and no acknowledgement of how it faked these metrics.

Another day, another example of the cesspool that is Facebook. It isn’t simply a coincidence that every time Facebook gets caught faking a metric, the fake metric was benefiting the company financially. Facebook paid a record-breaking $5 billion fine as part of a settlement of the Federal Trade Commission’s investigation into the Cambridge Analytica scandal. This time, Facebook only has to pay a paltry $40 million fine. That isn’t even a slap on the wrist, as it is only about 0.18% of their annual income. Facebook will pay the fine without acknowledging any responsibility or culpability, as is to be expected of Mark Zuckerberg.

Advertisers sued Facebook in 2016 over user metrics that supposedly measured the average length of time consumers spent viewing posted video ads. The lawsuit said that the time was inflated by up to 900 percent and that helped convince advertisers to buy Facebook’s video advertising services.

Faced with claims of violating unfair competition law, breaching contract, and committing fraud, Facebook contested advertisers’ injuries, questioning whether they really relied on these metrics in deciding to purchase ad time. In early rounds of the litigation, Facebook was successful in getting the judge to pare the claims, though until a settlement was announced, several of the claims, including fraud, were still live. Even after agreeing to pay $40 million to settle, Facebook maintains the suit is “without merit.”

In 2017, approximately 98% of Facebook’s revenue was generated by advertising, equal to about $39.9 billion US dollars. It has been reported that more than 50% of Facebook users are fake. That means that not only did Facebook generate revenue from over-inflated video views, they are also generating substantial revenue from fake users.



Internet Content Marketing Strategy: Aligning Context & Intent

When defining a Content Strategy for Internet Marketing you need to align Context with User Intent.

A frequent problem I encounter when analyzing a website’s content marketing strategy is that the marketing team is creating content without fully thinking about Context and Intent.

To understand Content Context in its simplest form, let us take a basic one word search query like “apple.”

This search term can be used in two different contexts. 1) apple – the fruit 2) Apple – the company.

We know that search engines like Google understand Content Context because they have Contextual Advertising Networks like Google AdSense that successfully place fruit advertisements on pages dealing with fruit, and laptop advertisements on pages dealing with computers. It would not be effective to advertise Macintosh laptop peripherals on a page with an apple pie recipe. Context is important.

Contextual factors also strongly influence the interpretation of a search query and the results that are returned. Contextual search is a form of optimizing Internet search results based on context provided by the user (User Context). It attempts to increase the precision of search results based on how valuable they are to individual users, drawing on signals such as the device being used (searches increasingly come from mobile phones) and the user’s search history. The easiest user context to understand is a user’s current location and how that impacts search results. Think “Local Search.”

That is User Context. Context also includes Content Context. You can think of Content Context as a component of natural language understanding. Search queries like “movie times” and “movie reviews” can be related to a particular topic, “movies.” These can be grouped into what are referred to as Context Clusters. This is how search engines like Google can return web pages for a search result even though the exact search term does not appear on the page: they understand the context of what is being searched for.

Content Context Clusters can be explained by going back to the example of searching for the word “apple.” A page with terms like Apple MacBook, Apple Watch, and Apple AirPods on it is probably not discussing apple the fruit and is more likely discussing Apple the company and Apple’s products.
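
The sketch below shows that idea in miniature: count which context terms co-occur with the ambiguous keyword and score the page toward the more likely context. The term lists and scoring are illustrative assumptions, not a description of how any search engine actually works.

```python
# Toy content-context scoring for the ambiguous keyword "apple".
# Term lists and scoring are illustrative; real engines use far richer signals.
CONTEXT_TERMS = {
    "apple_the_company": {"macbook", "iphone", "watch", "airpods", "ios"},
    "apple_the_fruit": {"orchard", "pie", "tree", "cider", "fuji", "mcintosh"},
}

def likely_context(page_text: str) -> str:
    words = set(page_text.lower().split())
    scores = {label: len(words & terms) for label, terms in CONTEXT_TERMS.items()}
    return max(scores, key=scores.get)

print(likely_context("Our apple orchard grows Fuji and McIntosh apples for cider and pie"))
# -> apple_the_fruit
print(likely_context("The new Apple Watch pairs with your iPhone and AirPods"))
# -> apple_the_company
```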

When a search engine like Google offers a user search suggestions as they begin typing a query, one or more context clusters may be presented to the user based on a context cluster probability. The context cluster probability is indicative of a probability that at least one query input that belongs to the context cluster will be selected by the user. A list of queries grouped into the context cluster may be presented as options for a query input selection.

As mentioned earlier, a user’s search history can also affect search context, and that takes us into the topic of User Intent.

So far we have discussed Context (both User Context and Content Context). Next, we need to take a look at Intent.

Search Intent seeks to understand the reason why people conduct a specific search. Why are they searching? Are they searching because they have a question and want an answer to that question? Are they searching for a specific website? Are they searching because they want to buy something or read reviews before buying it?

With the Google Hummingbird and Google RankBrain algorithms, the Google search engine can interpret search intent and display results that meet the user’s search intent.

3 Types Of Search Intent

User Search Intent can be categorized in 3 broad areas.

Informational Intent: To know something. The user wants to answer a specific question. These queries will include “how to”, “what is”, “where is”, and “why do”. Content in this category includes tutorials and introductory articles.

Navigational Intent: To find something. The user wants to find a specific website or location. Examples of these queries are “closest gas station”, or “facebook.”  Google Local Search is a great example of how Google is presenting content based on a user’s search intent.

Transactional Intent: To buy something. The user wants to purchase a product. Most keywords that have high commercial intent will fall into this category. Look for keywords like ‘buy’, ‘online store’, and ‘shipping’.

You could also have a 4th category called Commercial Investigation. This is the intent to research something prior to purchasing it. The user may want to purchase a product in the future, but wants to research the product first by reading product reviews or learning more about how a product works. I still lump this into Transactional Intent for content creation strategies as I view it as part of the Consumer Funnel. As users move closer to the actual act of buying, their searches become more precise (less ambiguous) and are easier to optimize content around. These queries include the words “best”, “review”, “top 10”, etc.

How to Optimize Content for Context and Intent

Knowing a bit about Context and Intent can help guide your Content Marketing Strategies. First, try to plan your content creation to satisfy specific user intentions driving their search queries. Understand related keywords and include them in the content so that search engines can determine the context of the content. If the content is on a narrow subject that could be ambiguous or may be searched for using ambiguous search terms, include a broader background of the subject within the article’s content that will help establish the context of the information and include meaningful related terms or phrases within the content that are less ambiguous.

Building upon the earlier example of “apple”, if you are creating an article on apple, the fruit, you would include related terms like orchard, tree, and popular varieties of apples like Cortland, Fuji, and McIntosh. I realize this is an oversimplified example, but hopefully you get the point. I have found that including related words and terms not only helps with context, but also helps with SEO.

Knowing about User Intent can further refine your content by dialing in the content to be about common search patterns like “How To Make Apple Pie”, “Where Is The Nearest Apple Store”, “How To Configure Apple TV”, and “Apple iPhone Review.”

When creating content for a “how to” query, structure the content so the H1 tag contains the query, and then put each step of the process in an H2 tag.
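
If your pages are generated from templates, that structure is easy to enforce programmatically. The sketch below is a hypothetical helper that renders a “how to” query as the H1 and each step as an H2; it is not tied to any particular CMS.

```python
# Hypothetical helper: render a "how to" article skeleton with the query
# in the H1 and each step in its own H2, per the guidance above.
from html import escape

def how_to_skeleton(query: str, steps: list[str]) -> str:
    parts = [f"<h1>{escape(query)}</h1>"]
    for number, step in enumerate(steps, start=1):
        parts.append(f"<h2>Step {number}: {escape(step)}</h2>")
        parts.append("<p>...</p>")  # step details go here
    return "\n".join(parts)

print(how_to_skeleton("How To Make Apple Pie",
                      ["Prepare the crust", "Make the filling", "Bake and cool"]))
```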

Aligning Content Context and Intent is part science, and part art. With time and practice you will get better at it, and naturally start thinking about content creation and content marketing strategies in a way that is optimized to perform better.



Cloud Computing: AWS vs. Google vs. Red Hat

A high level look at the cloud computing services and solutions offered by Amazon, Google, and Red Hat.

For the purposes of this article, I want to map out what I feel are the 3 best Cloud Computing providers: Amazon AWS, Google Cloud, and Red Hat.

Cloud Computing is a broad topic but can be categorized into 3 main areas: Public Clouds, Private Clouds, and Hybrid Clouds.

Public Cloud computing gets most of the attention, and Amazon Web Services (AWS) has the most market share in this space. Amazon had the first-mover advantage, and AWS leads both in terms of market share and in the number of Public Cloud services available.

In terms of reported market share based on revenue, Microsoft has the second most with their Azure cloud services, but many feel that is misleading since a large portion of what Microsoft reports as its “commercial cloud business” includes not only Azure, but also software-as-a-service (SaaS) solutions like Office 365, Dynamics 365, and other segments of the Productivity and Business Processes division. If a company is a “Microsoft Shop” and has invested heavily in Microsoft’s technology, then Microsoft Azure is the obvious choice… so I am not including it in this comparison. The same can be said for cloud services from other companies like Oracle and IBM. If you are invested heavily in their technology, then their cloud solutions may be the best fit for your organization.

According to the State of the Cloud report by Rightscale, Private Cloud and Hybrid Cloud strategies continue to grow. According to the report enterprises that combine public and private clouds grew to 58 percent in 2019 from 51 percent in 2018. Companies run a majority of workloads in cloud, according to Rightscale. Respondents overall run 38 percent of workloads in Public Cloud and 41 percent in Private Cloud. Respondents to the survey are already running applications in a combination of 3.4 public and private clouds and experimenting with 1.5 more for a total of 4.9 clouds. Among those using any public cloud, respondents are currently using 2.0 public clouds and experimenting with 1.8 more. Among those using any private cloud, respondents are currently using 2.7 private clouds and experimenting with 2.0 more.

In my opinion, Red Hat has the most viable solutions for Private Cloud and Hybrid Cloud, so I am including Red Hat in this comparison.

AWS vs. Google vs. Red Hat Cloud Services Map

A quick way to understand what the various cloud services providers offer is to map similar functionality in a table. The following table provides a high-level mapping of the services provided by AWS, Google, and Red Hat platforms.

SERVICE           | AMAZON                | GOOGLE                | RED HAT
Cloud Platform    | AWS                   | Google Cloud Platform | OpenStack
IaaS              | Elastic Compute Cloud | Compute Engine        | Cloud Infrastructure
PaaS              | Elastic Beanstalk     | App Engine            | OpenShift
FaaS (Serverless) | AWS Lambda            | Cloud Functions       | OpenWhisk
Hybrid/Private    | Outposts              | Anthos                | OpenStack

Hybrid Cloud & Private Cloud Computing

In cloud computing, hybrid cloud refers to the use of on-premises resources in addition to public cloud resources. A hybrid cloud enables an organization to migrate applications and data to the cloud, extend its datacenter capacity, utilize new cloud-native capabilities, move applications closer to customers, and create a backup and disaster recovery solution with cost-effective high availability.

Private cloud refers to a model of cloud computing where IT services are provisioned over private IT infrastructure for the dedicated use of a single organization. A private cloud is usually managed via internal resources. Private Cloud and Virtual Private Cloud (VPC) are often used interchangeably. Technically speaking, a VPC is a private cloud using a third-party cloud provider’s infrastructure, while a private cloud is implemented over internal infrastructure.

Amazon and Google are now both investing heavily in Hybrid and Private Cloud solutions, but these solutions are afterthoughts and are not elegantly architected into their core platforms. Amazon Outposts is a fully managed service from AWS where customers get AWS-configured hardware and software delivered to their on-premises data center or co-location space to run applications in a cloud-native manner without having to run them in AWS data centers. Google Anthos lets you build and manage modern hybrid applications on existing on-premises investments or in the public cloud. Built on open source technologies pioneered by Google, including Kubernetes, Istio, and Knative, Anthos enables consistency between on-premises and cloud environments, though not nearly as consistently as Red Hat’s solutions.

Red Hat has the most broadly supported solution with the OpenStack Platform. OpenStack is a set of free and open source software tools for building and managing cloud computing platforms for public and private clouds. Backed by some of the biggest companies in software development and hosting, as well as thousands of individual community members, many think that OpenStack is the future of cloud computing. OpenStack is managed by the OpenStack Foundation, a non-profit that oversees both development and community-building around the project. OpenStack allows companies to utilize their existing investment in hardware to provision hybrid or private cloud solutions. With AWS and Google, users are locked into those companies’ ecosystems. Unlike AWS or Google’s solutions, OpenStack gives companies numerous options for which Cloud Provider to use.

And The Winner Is…

Like most things with technology, there is not a clear winner with Cloud Computing. It all depends on what problem you are trying to solve, and what your strategy is.

I am an advocate of Open Source solutions, and try to avoid vendor lock-in and proprietary solutions when possible. The battle for Cloud Computing is actually more about APIs… and using those APIs ultimately means utilizing proprietary interfaces and microservices… which leads to some degree of vendor lock-in. I will most likely write more about that in the future.

That being said, I think there are some clear leaders in various segments. For Infrastructure as a Service (IaaS), Google and Amazon have the largest and most robust global assets and networks. Amazon has the most global data centers, but I lean toward Google’s data centers and their private global fiber network. Both Amazon and Google are best in class for infrastructure if you are just looking to spin up virtual machines and run servers in the cloud.

Moving up a level of abstraction gets us into the Platform as a Service (PaaS) space. The goal of these solutions is to lower the barrier to entry and make developing applications accessible to the largest possible community of developers, researchers, and businesses. These solutions focus on the developer, reducing the need for system administration, operations, and security skills. They scale your application by automatically starting new machine instances and deploying your application to the new instances, and all your instances automatically run behind a load balancer.

The PaaS solutions provide excellent integration with other services using simple API calls. It can be argued that the race for market dominance among cloud providers is actually all about which vendor has the best and broadest APIs. Google has an advantage in this space with simple APIs to services including Google Accounts for login, Gmail, Google Maps, and more. Additionally, Google has a broad array of Artificial Intelligence and Machine Learning services, although AI as a service is available from many cloud providers now.

Moving up yet another layer of abstraction takes us to the Functions as a Service (FaaS), or serverless, space. FaaS is the concept of serverless computing via serverless architectures. Software developers can leverage it to deploy an individual “function”, action, or piece of business logic. Functions are expected to start within milliseconds, process an individual request, and then exit.

Just like PaaS, FaaS provides the ability to easily deploy an application and scale it without having to provision or configure servers. Function-based apps can be used to replace microservice-style architectures and background services. Serverless computing allows businesses to run compute-intensive functions on demand with the near-unlimited scale of the cloud providers, and pay only for the time that code is actually running.
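
To make the FaaS model concrete, here is what a minimal function looks like in the AWS Lambda style for Python: a single handler the platform invokes per request, with no server to provision. The event shape and the return format are assumptions for illustration; Google Cloud Functions and Apache OpenWhisk follow the same idea with slightly different entry-point signatures.

```python
# Minimal AWS Lambda-style handler (Python runtime): the platform invokes
# lambda_handler(event, context) for each request; the process is not long-lived.
import json

def lambda_handler(event, context):
    # Hypothetical event shape: {"width": 200, "height": 100}
    width = int(event.get("width", 0))
    height = int(event.get("height", 0))

    return {
        "statusCode": 200,
        "body": json.dumps({"area": width * height}),
    }

# Local smoke test; in production the cloud provider supplies event and context.
if __name__ == "__main__":
    print(lambda_handler({"width": 200, "height": 100}, None))
```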

A lot of innovation is still going on in serverless computing, and things are rapidly improving and changing. Amazon uses AWS Lambda, Google uses Cloud Functions, and Red Hat appears to be coalescing around Apache OpenWhisk. A dominant solution or platform has not emerged yet.

Not every language is available on every platform for writing functions. JavaScript (Node.js) is really the only universally supported language and is the most commonly used in examples within documentation across all providers.

LANGUAGE             | AWS Lambda    | GCP Functions | Apache OpenWhisk
JavaScript (Node.js) | Yes           | Yes           | Yes
Java                 | Yes           | No            | Yes (Partial)
C#                   | Yes           | No            | No
Python               | Yes           | No            | Yes
PHP                  | No            | No            | Yes
Go                   | Yes (Partial) | No            | No
F#                   | No            | No            | No
Swift                | No            | No            | Yes

In terms of maturity, efficiency, language support, and ecosystem integration, AWS Lambda has the lead. However, if you are running a Private or Hybrid Cloud and you desire Open Source solutions, then Apache OpenWhisk is the leader.

If your organization needs a Private Cloud or a Hybrid Cloud and wants to avoid vendor lock-in, I think there is a very clear winner. Red Hat uses OpenStack as its native platform for cloud services, and also uses it for its hybrid and private cloud solutions. Additionally, because OpenStack is open source software, you can run it on Amazon AWS and on Google Cloud. The same applies to OpenShift and OpenWhisk. The ability to choose from numerous cloud vendors, along with having the same solution for private and hybrid cloud scenarios, makes OpenStack, OpenShift, and OpenWhisk the ideal technologies to build your cloud computing solutions on if you are considering a Hybrid Cloud or Private Cloud.
