Using Data Analysis To Design Better Photos

Better cameras don’t necessarily take photos that people think are better.

The visual image criteria that customers respond favorably to are an ongoing area of interest to me. I routinely use knowledge gained from A/B testing of images to achieve competitive advantage by producing images designed to appeal to specific audiences.
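To make the idea concrete, here is a minimal sketch of how two image variants might be compared with a two-proportion z-test. The function name and the click/view counts are invented for illustration; a real test would use actual poll or engagement data.

```python
import math

def two_proportion_z(clicks_a, views_a, clicks_b, views_b):
    """Z-test for the difference between two click-through rates."""
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    pooled = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical poll: variant A (warm, bright edit) vs. variant B (neutral edit)
z, p = two_proportion_z(460, 5000, 380, 5000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With numbers like these, variant A's higher click-through rate would be statistically significant, which is the kind of signal that guides which image style to produce more of.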

In a previous article I discussed how Netflix uses image analysis to create visual content.

This data analysis reveals trends in color preferences and style for images used at Netflix.

Cameras get better each year, with sensors capable of capturing more pixels and more dynamic range. The result is images that are sharp and have more detail in shadows and highlights than ever before.

Advances in Computational Photography allow cell phone cameras to automatically take “better photos” without extra effort or knowledge of photography. Throwing processing power at raw images lets smartphones and cameras do some amazing things. Computational Photography tweaks settings between each shot, ensuring that people and scenes always “look their best”.

At least that is the claim.

The problem is that better cameras don’t necessarily take photos that the majority of people prefer because they don’t take trends in color preference and style into consideration.

Marques Brownlee has a popular channel on YouTube where he reviews tech products. His video “The Blind Smartphone Camera Test 2018!” revealed that “better cameras” are not capturing and processing photos that the vast majority of people prefer over images captured by less capable cameras.

[Image: blind test poll results — O = iPhone X, P = Xiaomi Pocophone F1. Image credit: Marques Brownlee]
To reach as large an audience as possible, Marques ran the blind tests on images posted and viewed on Twitter and Instagram, drawing over 6 million votes in the polls for best image.

He used 16 of the most popular mobile phones in a winner-take-all, bracket-style competition. The field included phones with powerful cameras like the Pixel 3, Hydrogen, iPhone X, and iPhone XS, along with the less capable cameras of the Blackberry and Xiaomi Pocophone.

Most images are viewed on social media platforms now, and while it is well known that Twitter and Instagram apply heavy image compression, that is exactly where people see photos, so it makes sense to use these platforms to test trends and image preferences.

The results revealed an obvious trend. People do not like the perfectly exposed high dynamic range photos that the “best cameras” captured.

There’s obviously a massive subjective component: people like brighter, warmer, punchy photos.
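As a rough illustration of what “brighter, warmer, punchy” means in practice, here is a NumPy sketch of such an adjustment. The parameter values are arbitrary starting points for illustration, not measurements derived from the poll results.

```python
import numpy as np

def punchy(img, brightness=1.10, warmth=0.08, contrast=1.15):
    """Apply a brighter, warmer, higher-contrast 'punchy' look.

    img: float RGB array in [0, 1], shape (H, W, 3).
    Parameter values are illustrative starting points.
    """
    out = img * brightness                 # brighten overall
    out[..., 0] *= 1 + warmth              # push red channel up
    out[..., 2] *= 1 - warmth              # pull blue channel down -> warmer
    out = (out - 0.5) * contrast + 0.5     # boost contrast around mid-gray
    return np.clip(out, 0.0, 1.0)

# A flat mid-gray patch comes back brighter and warmer (red > blue)
patch = np.full((4, 4, 3), 0.5)
result = punchy(patch)
```

A camera pipeline tuned for technical accuracy leaves these choices neutral; the poll results suggest viewers reward pipelines (or editors) that push them.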

Computational photography and large image sensors capable of capturing copious amounts of detail and high dynamic range are still not a substitute for good glass and an artistic understanding of photographic principles and composition.

In the end, you have to understand what your customers prefer, and craft visual strategies designed to appeal to those preferences.

Category : BriteWire Blog

Google shutting down Google+ following massive undisclosed user data exposure

PR teams at Google and Facebook have been working overtime this week due to breaches of user data on their networks. Google announced that it would shut down Google+ after it discovered a security vulnerability that exposed the private data of up to 500,000 users. Google+ is the company’s long-struggling answer to Facebook’s giant social network.

It really is not that much of a blow for Google to shutter Google+. Google admits that Google+ has “low usage and engagement” and that 90 percent of Google+ user sessions last less than five seconds.

It is Google’s lack of disclosure on the security breach that is causing waves in the cybersecurity community. There are rules in California and Europe that govern when a company must disclose a security episode.

Google did not tell its users about the security issue when it was found in March because it didn’t appear that anyone had gained access to user information, and the company’s “Privacy & Data Protection Office” decided it was not legally required to report it, the search giant said in a blog post.

Others are citing the leak as further evidence that the large technology platforms need more regulatory oversight.

“Monopolistic internet platforms like Google and Facebook are probably ‘too big to secure’ and are certainly ‘too big to trust’ blindly,” said Jeff Hauser, from the Center for Economic and Policy Research.

Google+ had some innovative ideas that just never caught on. At one time Google put significant effort into pushing the adoption of Google+, including using its data to personalize search results based on what a user’s connections had +1’d.

Thankfully I never invested heavily in Google+, but I did like how you could organize content into collections, which I aligned with market segments.

Google+ will shut down over a 10-month period, slated for completion by the end of August 2019.

Google also announced a series of reforms to its privacy policies designed to give users more control over the amount of data they share with third-party app developers.

Users will now have more fine-grained control over which aspects of their Google accounts they grant to third parties (i.e., calendar entries vs. Gmail), and Google will further limit third-party access to email, SMS, contacts, and phone logs.

DuckDuckGo Traffic Growth

The growth of DuckDuckGo is amazing. It took seven years for DuckDuckGo to reach 10 million searches in one day. It then took another two years to hit 20 million searches in a day. It took less than a year after that for DuckDuckGo to surpass 30 million searches in a day!

If you are not familiar with the search engine DuckDuckGo yet, you will probably be hearing more about it from now on. DuckDuckGo is a search engine, like Google, Yahoo, or Bing, but with DuckDuckGo your searches and your IP address are kept 100% anonymous. People seeking ways to reduce their digital footprint online are driving the rapid traffic growth at DuckDuckGo.

Compared to Google, DuckDuckGo is still tiny. At the time of this blog post, the record for daily searches at DuckDuckGo was 30,602,556. That makes it less than 1% the size of Google which handles over 3.5 billion searches per day.
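The arithmetic behind that comparison is straightforward, using the record day and the approximate Google figure cited above:

```python
ddg_record_daily = 30_602_556       # DuckDuckGo's record day, per this post
google_daily = 3_500_000_000        # Google's ~3.5 billion searches per day

share = ddg_record_daily / google_daily
print(f"DuckDuckGo is {share:.2%} the size of Google")  # under 1%
```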

However, the daily search volume on DuckDuckGo is now approximately one quarter of the daily search volume of Bing. Not bad for a small search startup with a funny name and only 40 employees.

Google continues to dwarf the search volume of competitors like DuckDuckGo, but continuing issues with censorship and data security are starting to cause many users to question whether they want to use Google.

This week Google admitted that a security vulnerability exposed the private data of as many as 500,000 people. As a result, it announced it will be shutting down Google+.

People are increasingly becoming aware that giant technology companies like Google and Facebook are making money off of their private information online. As more competitors to these established titans of technology begin educating users about how they are being tracked, alternatives like DuckDuckGo are gaining market share.

Artificial Neural Networks For Marketing

Artificial Neural Networks are progressive learning systems, modeled after the human brain, that continuously improve their function over time. They can be effective at gathering and extracting relevant information from big data, identifying valuable trends, relationships, and connections within the data, and then drawing on past outcomes and behaviors to help identify and implement the best marketing tactics and strategies.

The human brain consists of roughly 100 billion cells called neurons, connected together by synapses. If sufficient synaptic inputs to a neuron fire, that neuron will also fire. This process is called “thinking”.

Artificial neural networks are a form of computer program modeled after the way the human brain and nervous system works. It is not necessary to model the biological complexity of the human brain at a molecular level, just its higher level rules.

Common practical uses of Neural Networks include character recognition, image classification, speech recognition, and facial recognition. They are also used for Predictive Analytics. Read our article on HOW LEADING TECHNOLOGY COMPANIES ARE USING ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING.

Artificial Neural Networks (ANNs) are a series of interconnected artificial processing neurons functioning in unison to achieve desirable outcomes.

Artificial neurons are elementary units in an artificial neural network. Each artificial neuron receives one or more inputs and sums them to produce an output. Think of artificial neurons as simple storage containers.

Artificial Neural Networks are composed of three main parts: the input layer, the hidden layer, and the output layer. Note that you can have n hidden layers; the term “deep” learning implies multiple hidden layers. Each layer is a one-dimensional array.

During processing, each input is separately weighted, and the sum is passed through a non-linear function, known as an activation function or transfer function, to the next set of artificial neurons.
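That weighted-sum-plus-activation step can be sketched in a few lines of NumPy. The layer sizes, random weights, and ReLU activation below are illustrative choices, not a prescription:

```python
import numpy as np

def relu(x):
    """A common activation (transfer) function."""
    return np.maximum(0.0, x)

def layer(inputs, weights, biases):
    """One layer: weight each input, sum, then apply the activation."""
    return relu(inputs @ weights + biases)

# Tiny network: 3 inputs -> 4 hidden neurons -> 1 output
rng = np.random.default_rng(0)
x = np.array([0.5, -1.0, 2.0])
W1 = rng.normal(size=(3, 4)); b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)

hidden = layer(x, W1, b1)       # hidden layer
output = layer(hidden, W2, b2)  # output layer
```

Training consists of adjusting the weights so the output layer's predictions improve over time.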

Using trial-and-error learning methods, neural networks detect patterns within a data set, ignoring data that is not significant while emphasizing the data that is most influential.

From a marketing perspective, neural networks are a software tool used to assist decision making. They are effective at gathering and extracting information from large data sources and identifying cause and effect within the data. Through the process of learning, these neural nets identify relationships and connections between databases. Once knowledge has been accumulated, neural networks can be relied on to provide generalizations and to apply past knowledge and learning to a variety of situations.

Neural networks help fulfill the role of marketing companies by effectively aiding market segmentation and performance measurement while reducing costs and improving accuracy. Due to their learning ability, flexibility, adaptability, and capacity for knowledge discovery, neural networks offer many advantages over traditional models. They can be used to assist in pattern classification, forecasting, and marketing analysis.

How Leading Technology Companies Are Using Artificial Intelligence And Machine Learning

Artificial Intelligence looks for patterns, learns from experience, and predicts responses based on historical data. Artificial Intelligence is able to learn new things at incredible speeds. Artificial Intelligence can be used to accurately predict your behavior and preempt your requests.

Artificial Intelligence and Machine Learning are shaping many of the products and services you interact with every day. In future blog posts I will be discussing how Artificial Intelligence, Machine Learning, Neural Networks, and Predictive Analytics are being used by marketers to achieve competitive advantage.

AI’s (Artificial Intelligence) ability to simulate human thinking means it can streamline our lives. It can preempt our needs and requests, making products and services more user friendly as machines learn our needs and figure out how to serve us better.

Here is how some of the top companies are using Artificial Intelligence.


Google is investing heavily in Artificial Intelligence and Machine Learning. Google acquired the AI company DeepMind for its energy consumption, digital health, and general-purpose Artificial Intelligence programs, and is integrating it into many of its products and services. They are primarily using TensorFlow – an open source software library for high performance numerical computation. They are using Artificial Intelligence and pattern recognition to improve their core search services. Google is also using AI and machine learning for their facial recognition services, and for natural language processing to power their real-time language translation. Google Assistant uses Artificial Intelligence, as does the Google Home series of smart home products, like the Nest thermostat. Google is using a TensorFlow model in Gmail to understand the context of an email and predict likely replies. They call this feature “Smart Reply.” Having acquired more than 50 AI startups in 2015–16, Google seems to be only getting started on its AI agenda.


Amazon has been investing heavily in Artificial Intelligence for over 20 years. Amazon’s approach to AI is called a “flywheel”. At Amazon, the flywheel approach keeps AI innovation humming along and encourages energy and knowledge to spread to other areas of the company. Amazon’s flywheel approach means that innovation around machine learning in one area of the company fuels the efforts of other teams. Artificial Intelligence and Machine Learning (ML) algorithms drive many of their internal systems. Artificial Intelligence is also core to their customer experience – from the recommendations engine that analyzes and predicts your shopping patterns, to Echo powered by Alexa, and the path optimization in their fulfillment centers. Amazon’s mission is to share their Artificial Intelligence and Machine Learning capabilities as fully managed services, and put them into the hands of every developer and data scientist on Amazon Web Services (AWS). Learn more about Amazon Artificial Intelligence and Machine Learning.


Facebook has come under fire for their widespread use of Artificial Intelligence analytics to target users for marketing and messaging purposes, but they remain committed to advancing the field of machine intelligence and are creating new technologies to give people better ways to communicate. They have also come under fire for not doing enough to moderate content on their platform. Billions of text posts, photos, and videos are uploaded to Facebook every day. It is impossible for human moderators to comprehensively sift through that much content. Facebook uses artificial intelligence to suggest photo tags, populate your newsfeed, and detect bots and fake users. A new system, codenamed “Rosetta,” helps teams at Facebook and Instagram identify text within images to better understand what their subject is and more easily classify them for search or to flag abusive content. Facebook’s Rosetta system scans over a billion images and video frames daily across multiple languages in real time. Learn more about Facebook AI Research. Facebook also has several Open Source Tools For Advancing The World’s AI.


Microsoft added Research and AI as their fourth silo alongside Office, Windows, and Cloud, with the stated goal of making broad-spectrum AI application more accessible and everyday machines more intelligent. Microsoft is integrating Artificial Intelligence into a broad range of Microsoft products and services. Cortana is powered by machine learning, allowing the virtual assistant to build insight and expertise over time. AI in Office 365 helps users expand their creativity, connect to relevant information, and surface new insights. Microsoft Dynamics 365 business applications use Artificial Intelligence and Machine Learning to analyze data, improve your business processes, and deliver predictive analytics. Bing is using advances in Artificial Intelligence to make it even easier to find what you’re looking for. Microsoft’s Azure Cloud Computing Services has a wide portfolio of AI productivity tools and services. Microsoft’s Machine Learning Studio is a powerfully simple browser-based, visual drag-and-drop authoring environment where no coding is necessary.


Apple is the most tight-lipped among top technology companies about their AI research. Siri was one of the first widely used examples of Artificial Intelligence used by consumers. Apple had a head start, but appears to have fallen behind their competitors. Apple’s Artificial Intelligence strategy continues to be focused on running workloads locally on devices, rather than running them on cloud-based resources like Google, Amazon, and Microsoft do. This is consistent with Apple’s stance on respecting User Privacy. Apple believes their approach has some advantages. They have a framework called Create ML that app makers can use to train AI models on Macs. Create ML is the Machine Learning framework used across Apple products, including Siri, Camera, and QuickType. Apple has also added Artificial Intelligence and Machine Learning to its Core ML software allowing developers to easily incorporate AI models into apps for iPhones and other Apple devices. It remains to be seen if Apple can get developers using the Create ML technology, but given the number of Apple devices consumers have, I expect they will get some traction with it.

These are just a few examples of how leading technology companies are using artificial intelligence to improve the products and services we use every day.

Facebook Usage Declines

42% of Facebook users say they have taken a break from checking the platform for a period of several weeks or more. 26% say they have deleted the Facebook app from their cellphone.

According to Pew Research, Facebook users continue to reduce the amount of time they are spending on the platform. Just over half of Facebook users ages 18 and older (54%) say they have adjusted their privacy settings in the past 12 months, according to a new Pew Research Center survey. Around four-in-ten (42%) say they have taken a break from checking the platform for a period of several weeks or more, while around a quarter (26%) say they have deleted the Facebook app from their cellphone. All told, some 74% of Facebook users say they have taken at least one of these three actions in the past year.

The findings come from a Pew Research survey of U.S. adults conducted May 29, 2018 through June 11, 2018.

There are, however, age differences in the share of Facebook users who have recently taken some of these actions. Most notably, 44% of younger users (those ages 18 to 29) say they have deleted the Facebook app from their phone in the past year, nearly four times the share of users ages 65 and older (12%) who have done so. Similarly, older users are much less likely to say they have adjusted their Facebook privacy settings in the past 12 months: Only a third of Facebook users 65 and older have done this, compared with 64% of younger users. In earlier research, Pew Research Center has found that a larger share of younger than older adults use Facebook. Still, similar shares of older and younger users have taken a break from Facebook for a period of several weeks or more.

With 42 percent of the audience taking breaks from the platform, daily active users should fall. And with more than half of users changing their privacy settings, there should be less opportunity for accurate ad targeting and lower advertising efficiency on Facebook.

Full Article at Pew Research.

Latent Semantic Indexing – Does It Help SEO?

Latent Semantic Indexing (LSI) and Latent Semantic Analysis (LSA) are techniques in natural language processing for analyzing relationships between a set of documents and the terms they contain, by producing a set of concepts related to those documents and terms.


I thought it might be helpful to explore the concept of Latent Semantic Indexing and its impact on Search Engine Optimization (SEO). If you research search engine optimization techniques you are likely to come across articles on Latent Semantic Indexing (LSI) or Latent Semantic Analysis (LSA). There is some debate about the effectiveness of creating content designed to appeal to LSI algorithms in order to improve organic search results placement. My position is that by understanding LSI/LSA you are able to create better content, regardless of whether it improves organic search results on Google, Bing, etc.

What Is Latent Semantic Indexing?

In simple terms, Latent Semantic Indexing allows a search engine to include pages with the text “Home For Sale” in the search results for the search term “House For Sale.” That is because “Home” and “House” are related terms. LSI is a technique that reveals underlying concepts of content by means of synonyms and related words. It is basically using data correlation to find related terms. (see article: A Primer On Data Correlation For Marketers). Latent Semantic Indexing finds hidden (latent) relationships between words (semantics) in order to improve information understanding (indexing).
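The “home”/“house” effect can be sketched with a toy term-document matrix and a truncated SVD, which is the core of LSA. The vocabulary and documents below are invented for illustration: “home” and “house” never appear in the same document, yet they land close together in the latent concept space because both co-occur with “sale”.

```python
import numpy as np

terms = ["home", "house", "sale", "car", "engine"]
# Term-document counts for four tiny documents (illustrative data):
#   d1: "home sale"   d2: "house sale"   d3/d4: "car engine"
A = np.array([
    [1, 0, 0, 0],  # home
    [0, 1, 0, 0],  # house
    [1, 1, 0, 0],  # sale
    [0, 0, 1, 1],  # car
    [0, 0, 1, 1],  # engine
], dtype=float)

# Truncated SVD: keep only the top k latent "concepts"
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
term_vecs = U[:, :k] * s[:k]   # each row is a term in concept space

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

home, house, car = term_vecs[0], term_vecs[1], term_vecs[3]
```

Here `cos(home, house)` comes out near 1 while `cos(home, car)` is near 0: the latent space has discovered that “home” and “house” are related terms even though the raw counts never pair them.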

Latent Semantic Indexing As SEO Strategy

It has been suggested that using synonyms and related terms in optimized website content could help a search engine better understand the content and rank it higher in search results. The idea is to take your target keywords or search term and identify a list of “LSI keywords” to include in the content.

Search engines attempt to understand the context of any piece of content they index. The field of semantics (the study of meaning in language) is a fundamental part of this approach.

While it is believed that search engines relied heavily on LSI in the early stages of their development, there is no evidence that Google or Bing relies heavily on it now.

Why Consider LSI As Part Of Your Content Strategy?

My opinion is that you should focus on creating good quality content on your website rather than trying to game Google or Bing with optimization techniques. As part of my content strategies, I use a basic understanding of Latent Semantic Analysis to create better content. Different people may use different terms to describe the same topic, and including these related terms can make the content more relevant to the reader. I also use LSA to make sure I am not overusing the primary keywords or search term, and to come up with derivative articles related to the primary article. An effective SEO strategy should also include relevant backlinks, relevant alt tags, etc.

In summary, I am not convinced that using Latent Semantic Analysis will improve your search rankings, but I feel it can improve your content.

Video Resolution and Image Megapixels Requirements

Understanding image output requirements ensures you select the right equipment for capturing video and photographic content, and helps properly estimate storage and post production resources needed to deliver a quality product.

I had a project recently (early 2018) where I needed to deliver numerous videos and photographic images for marketing purposes. The images needed to be suitable both for the Internet and for printing.

I was asked to evaluate the current state of the market for 4K Video Production, and current trends for digital marketing.

I needed to estimate storage needs (cloud-based and local) and determine what equipment to purchase for capturing the video and still images and for post-production editing.

I am turning my research on the subject into a blog post for future reference.

Starting with video, I researched the resolutions of current video standards. The first graphic shows the relative size differences of various video formats.

Video Resolution Megapixels

If you’re planning your next shoot or looking into a video marketing investment, you’ve probably considered 4K or higher. Over the past few years, 4K video has gained quite a bit of popularity.

YouTube started supporting 4K 60fps video uploads in 2013, and 8K video uploads in 2015. Although understandably small as a proportion of YouTube’s total ingest, these formats are significant, and take-up is accelerating.

We were hoping to standardize on 4K video until we discovered that one minute of uncompressed 4K video is about 32 GB. Twenty minutes of 1080p video using the Apple ProRes 422 HQ codec takes about 20 GB of storage space; using the same codec, 20 minutes of 4K video takes about 200 GB.

I would be filming several interviews plus lots of B-roll which would result in several terabytes of raw video files.
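The post’s own rules of thumb (roughly 1 GB per minute of 1080p and 10 GB per minute of 4K in ProRes 422 HQ) make that estimate easy to sketch. The interview and B-roll quantities below are hypothetical:

```python
def gb_needed(minutes, gb_per_min):
    """Storage estimate in GB for a given shoot length."""
    return minutes * gb_per_min

# Rules of thumb from the ProRes 422 HQ figures above:
GB_PER_MIN_1080 = 1.0   # ~20 GB per 20 minutes of 1080p
GB_PER_MIN_4K = 10.0    # ~200 GB per 20 minutes of 4K

# Hypothetical shoot: four 1-hour interviews plus 3 hours of B-roll, in 4K
total_minutes = 4 * 60 + 3 * 60
print(gb_needed(total_minutes, GB_PER_MIN_4K) / 1000, "TB")
```

At 4K, even a modest shoot like this lands in multi-terabyte territory, while the same footage at 1080p would fit in a few hundred gigabytes.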

Since none of our existing video cameras and drones could capture 4K video, this meant purchasing all new video equipment. After estimating the cost to capture and edit 4K video, we elected to go with Full HD (1920 x 1080) for now (early 2018).

I know I will be moving to 4K video in the near future.

I also had a requirement that our photographs were to be used in print campaigns. Some material would be printed at 300dpi, while other material required 600dpi.

I had to ensure that our images were being captured and stored at high enough resolutions to support printing at 600dpi without upscaling or pixel interpolation.

The following table illustrates the pixel resolution we need to support different size prints at different dot per inch targets.

Megapixels Pixel Resolution* Print Size @ 600ppi Print size @ 300ppi Print size @ 150ppi**
3 2048 x 1536 3.41″ x 2.56″ 6.82″ x 5.12″ 13.65″ x 10.24″
4 2464 x 1632 4.12″ x 2.72″ 8.21″ x 5.44″ 16.42″ x 10.88″
6 3008 x 2000 5.01″ x 3.34″ 10.02″ x 6.67″ 20.05″ x 13.34″
8 3264 x 2448 5.44″ x 4.08″ 10.88″ x 8.16″ 21.76″ x 16.32″
10 3872 x 2592 6.46″ x 4.32″ 12.91″ x 8.64″ 25.81″ x 17.28″
12 4290 x 2800 7.15″ x 4.67″ 14.30″ x 9.34″ 28.60″ x 18.67″
16 4920 x 3264 8.20″ x 5.44″ 16.40″ x 10.88″ 32.80″ x 21.76″
20(35mm film, scanned) 5380 x 3620 8.97″ x 6.03″ 17.93″ x 12.06″ 35.87″ x 24.13″
24 6016 x 4016 10.03″ x 6.69″ 20.05″ x 13.39″ 40.11″ x 26.78″
36 7360 x 4912 12.27″ x 8.19″ 24.53″ x 16.37″ 49.06″ x 32.74″

*Typical Resolution. Actual pixel dimensions vary from camera to camera.

**At 150ppi, printed images will have visible pixels and details will look “fuzzy”.

Each colored box represents a certain number of megapixels. The numbers along the top and left side are print dimensions in inches at 300ppi (pixels per inch). Most books and magazines require 300ppi for photo quality. For example, the chart shows that you can make a 5″ x 7″ photo quality print from a 3 megapixel camera.

[Chart: Print Megapixels Required — print dimensions in inches @ 300ppi; numbers inside colored boxes are megapixels]

Notice that as the print size doubles, the megapixels required quadruple (doubling both dimensions quadruples the pixel count). You can make nice 8″ x 10″ prints with a 6 or 8 megapixel camera, but to make a true photo quality 16″ x 20″ print, you need between 24 and 30 megapixels. Don’t be fooled by manufacturers’ claims that say you can make 16″ x 20″ prints from an 8 megapixel camera. While you certainly can make a print that size, it will not be true photo quality.
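The table and chart reduce to two small formulas; here is a sketch using the pixel dimensions from the table above:

```python
def max_print_inches(width_px, height_px, ppi=300):
    """Largest print at a given ppi without upscaling or interpolation."""
    return width_px / ppi, height_px / ppi

def megapixels_needed(width_in, height_in, ppi=300):
    """Megapixels required for a photo-quality print of a given size."""
    return width_in * ppi * height_in * ppi / 1e6

print(max_print_inches(6016, 4016))   # 24 MP sensor -> about 20.1" x 13.4"
print(megapixels_needed(20, 16))      # 16" x 20" at 300ppi -> 28.8 MP
```

The second calculation confirms the 16″ x 20″ figure: a true photo-quality print at 300ppi needs 28.8 megapixels, which is why 24 megapixels is the practical floor.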

Having gone through this exercise I was able to estimate what new equipment we would need, set some standards for capturing videos and images, and procure the required resources to edit and store the digital assets.


How ‘Facebook Zero’ Will Impact Social Media Marketing

Facebook’s latest algorithm update (Facebook Zero) will have a significant impact on social media marketing efforts on the platform. In this article, you’ll find out what to expect from the changes and learn how you can best maintain interaction and visibility with audiences in the Facebook news feed.

Facebook is making significant changes to its news feed algorithm in an effort to prioritize “meaningful” person-to-person interactions among friends and family over posts from Facebook Pages. Facebook has acknowledged that passively consuming articles or videos that don’t spark engagement or interaction is bad for a person’s mood.

While Facebook still values FB Page content as an important part of their platform’s ecosystem, the news feed will shift the focus from ranking content that’s directly consumed from Pages (which will shrink in reach) to content that is shared and talked about among friends (which will grow).

Reducing or eliminating this passive consumption may improve the user experience, but these updates will result in fewer public posts from FB Pages and fewer videos in users’ news feeds unless companies and organizations pay to promote that content.

This will continue the decline in referral traffic from Facebook (already down over 25%), a topic I discussed in a previous blog post titled: Referral Traffic Trends: Facebook Declines, Google Grows.

Mark Zuckerberg said this about the planned updates to the news feed:

“As we roll this out, you’ll see less public content like posts from businesses, brands, and media. And the public content you see more will be held to the same standard – it should encourage meaningful interactions between people.”

Organic reach for business pages has been declining for a long time; the Facebook Zero algorithm update just accelerates the decline. The chart in this article clearly shows the decrease in organic reach for Page posts since 2012.

Facebook is a for-profit company. As such they are financially motivated to slowly decrease the efficacy of their organic (i.e. FREE) advertising in order to make people move to their paid ad platform.

Google did the same thing in their “search engine results” which drove revenue growth for their AdWords Paid Search Advertising service. In a previous blog post I discussed how this ultimately caused Google to lose market share for product searches on the Internet. Amazon now leads Google in Product Searches.

It is why I (along with the European Union) no longer call Google a search engine. I refer to it as a Paid Placement Advertising Service.


If you rely primarily on organic traffic from Facebook, then the Facebook Zero algorithm is bad news.

As Facebook begins to prioritize content that facilitates meaningful people-to-people connections and interactions, marketers can expect the organic reach of their Facebook pages to drop.

Specifically, users will see more posts from people they’re connected to in the news feed and less content from pages they follow. This is a significant change from the previous values of the news feed, which prioritized the total time spent on Facebook and how many people shared posts directly.

Users will also start seeing less video content in the news feed because it typically generates less conversation, particularly public videos.

Facebook will give your followers the option to see your content first in their news feed, though very few users are likely to figure out how to set it. Facebook also hints that Groups and Live Video may get a boost with the new update.


Facebook has stated that posts that spark and inspire conversations and meaningful interactions among people will be prioritized over public content in the news feed after the update.

Content that is shared and commented on will receive higher prioritization, and comments are more valuable than Likes. Comments that take time and thought to type out will be a positive ranking signal for the content.


Instead of posting everything, businesses and organizations will have to be more strategic about the content they post. They need to create posts that promote meaningful connections and interactions among people. Those will be prioritized in the news feed.

It is also wise for businesses and organizations to utilize additional platforms and strategies for getting their content in front of their target audience. If your business doesn’t have an audience on other platforms… now is the time to build those mechanisms.

I will talk more about that in future posts.

Until then, you may be interested in these related posts: Social Media Content Half Life, Big Data & Psychometric Marketing, Marketing Uplift & Predictive Modeling, and Targeted Marketing Models.


Referral Traffic Trends: Facebook Declines, Google Grows

In the back and forth battle for who reigns supreme for referral traffic, Google has once again taken over the lead from Facebook.


In July 2015 Facebook pulled ahead of Google as the top source of referral traffic: Facebook’s share of referral traffic grew to 38.3 percent vs. Google’s 35.8 percent.

However, beginning in February 2017, websites began experiencing a significant and steady decline in referral traffic from Facebook, while referral traffic from Google began to increase slightly.

After six months of steadily declining numbers, Facebook referral traffic is now down over 25%.


It is important to point out that the decrease in referral traffic from Facebook has not affected all publishers equally. Some have seen huge drops in traffic, while others have seen an increase in the same period.

Two-thirds of publishers saw a decrease in referral traffic over this period, while the remaining one-third saw an increase. For half, traffic decreased a substantial 20% or more, and for 20% of publishers, Facebook referral traffic was cut by 50% or more.

Facebook continues to tweak the algorithms it uses to govern which posts appear in the News Feed. It appears that Facebook is downgrading the chances of content from businesses and organizations appearing in News Feeds… perhaps in an attempt to create more demand for its advertising business.

This topic is discussed (and confirmed) in a BriteWire blog post titled: How ‘Facebook Zero’ Will Impact Social Media Marketing.
