Facebook Usage Declines

42% of Facebook users say they have taken a break from checking the platform for a period of several weeks or more. 26% say they have deleted the Facebook app from their cellphone.

42 Percent of Facebook Users Taking A Break

According to Pew Research, Facebook users continue to reduce the amount of time they spend on the platform. Just over half of Facebook users ages 18 and older (54%) say they have adjusted their privacy settings in the past 12 months, according to a new Pew Research Center survey. Around four-in-ten (42%) say they have taken a break from checking the platform for a period of several weeks or more, while around a quarter (26%) say they have deleted the Facebook app from their cellphone. All told, some 74% of Facebook users say they have taken at least one of these three actions in the past year.

The findings come from a Pew Research survey of U.S. adults conducted May 29, 2018 through June 11, 2018.

Younger Facebook Users Adjusting Privacy Settings

There are, however, age differences in the share of Facebook users who have recently taken some of these actions. Most notably, 44% of younger users (those ages 18 to 29) say they have deleted the Facebook app from their phone in the past year, nearly four times the share of users ages 65 and older (12%) who have done so. Similarly, older users are much less likely to say they have adjusted their Facebook privacy settings in the past 12 months: only a third of Facebook users 65 and older have done this, compared with 64% of younger users. In earlier research, Pew Research Center has found that a larger share of younger than older adults use Facebook. Still, similar shares of older and younger users have taken a break from Facebook for a period of several weeks or more.

If 42 percent of the audience is taking extended breaks from the platform, that should translate to fewer daily active users. And with more than half of users adjusting their privacy settings, there is less opportunity for accurate ad targeting, which lowers the efficiency of advertising on Facebook.

Full Article at Pew Research.


Latent Semantic Indexing – Does It Help SEO?

Latent Semantic Indexing (LSI) and Latent Semantic Analysis (LSA) are techniques in natural language processing to analyze relationships between a set of documents and the terms they contain by producing a set of concepts related to the documents and terms.

Latent Semantic Indexing - Does It Help SEO?

I thought it might be helpful to explore the concept of Latent Semantic Indexing and its impact on Search Engine Optimization (SEO). If you research search engine optimization techniques, you are likely to come across articles on Latent Semantic Indexing (LSI) or Latent Semantic Analysis (LSA). There is some debate about the effectiveness of creating content designed to appeal to LSI algorithms as a way to improve organic search placement. My position is that by understanding LSI/LSA you are able to create better content, regardless of whether it improves organic search results on Google, Bing, etc.

What Is Latent Semantic Indexing?

In simple terms, Latent Semantic Indexing allows a search engine to include pages with the text “Home For Sale” in the search results for the search term “House For Sale.” That is because “Home” and “House” are related terms. LSI is a technique that reveals the underlying concepts of content by means of synonyms and related words. It basically uses data correlation to find related terms (see the article: A Primer On Data Correlation For Marketers). Latent Semantic Indexing finds hidden (latent) relationships between words (semantics) in order to improve information understanding (indexing).
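To make that concrete, here is a minimal sketch of the mechanics behind LSA: a TF-IDF term-document matrix reduced with truncated SVD, the standard LSA technique, after which terms that appear in similar contexts sit close together in a low-dimensional “concept” space. The tiny corpus, the scikit-learn calls, and the two-component setting are illustrative assumptions only, not a description of how any search engine actually implements this.

```python
# Minimal LSI/LSA sketch: reduce a TF-IDF term-document matrix with truncated
# SVD, then compare terms in the resulting low-dimensional "concept" space.
# The corpus below is made up purely for illustration.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

docs = [
    "new home for sale in a quiet neighborhood",
    "three bedroom house for sale near downtown",
    "apartment rentals listed and updated daily",
    "buy a house with help from a real estate agent",
    "find a home to buy through a local real estate agent",
    "rent an apartment or condo downtown",
]

vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(docs)                  # rows: documents, columns: terms

svd = TruncatedSVD(n_components=2, random_state=0)  # 2 latent "concepts"
svd.fit(X)

# Each column of svd.components_ gives one term's coordinates in concept space.
terms = list(vectorizer.get_feature_names_out())
term_vecs = svd.components_.T

def term_similarity(a: str, b: str) -> float:
    """Cosine similarity between two terms in the latent (concept) space."""
    va, vb = term_vecs[terms.index(a)], term_vecs[terms.index(b)]
    return float(np.dot(va, vb) / (np.linalg.norm(va) * np.linalg.norm(vb)))

# Terms that appear in similar contexts ("home"/"house") tend to score higher
# than terms drawn from unrelated contexts ("home"/"apartment").
print(term_similarity("home", "house"))
print(term_similarity("home", "apartment"))
```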

Latent Semantic Indexing As SEO Strategy

It has been suggested that using synonyms and related terms in optimized website content could help a search engine better understand the content and rank it higher in search results. The idea is to take your target keyword or search term and identify a list of “LSI Keywords” to include in the content.

Search engines attempt to understand the context of any piece of content they index. The field of semantics (the study of meaning in language) is a fundamental part of this approach.

While it is believed that search engines relied heavily on LSI in the early stages of their development, there is no evidence that Google and Bing rely heavily on it now.

Why Consider LSI As Part Of Your Content Strategy?

My opinion is that you should focus on creating good quality content on your website rather than trying to game Google or Bing with optimization techniques. As part of my content strategies, I use a basic understanding of Latent Semantic Analysis to create better content. Different people may use different terms to describe the same topic, and including these related terms can make the content more relevant to the reader. I also use LSA to make sure I am not overusing the primary keyword or search term, and to come up with derivative articles related to the primary article. An effective SEO strategy should also include relevant backlinks, relevant alt tags, and so on.

In summary, I am not convinced that using Latent Semantic Analysis will improve your search rankings, but I feel it can improve your content.


Video Resolution and Image Megapixels Requirements

Understanding image output requirements ensures you select the right equipment for capturing video and photographic content, and helps you properly estimate the storage and post-production resources needed to deliver a quality product.

I had a project recently (early 2018) where I needed to deliver numerous videos and photographic images for marketing purposes. The images needed to be suitable both for the Internet and for printing.

I was asked to evaluate the current state of the market for 4K Video Production, and current trends for digital marketing.

I needed to estimate storage needs (cloud-based and local) and determine what equipment to purchase for capturing video and still images and for post-production editing.

I am turning my research on the subject into a blog post for future reference.

Starting with video, I researched the resolutions of current video standards. The first graphic shows the relative size difference of various video formats.

Video Resolution Megapixels

If you’re planning your next shoot or looking into a video marketing investment, you’ve probably considered 4K or higher. Over the past few years, 4K video has gained quite a bit of popularity.

YouTube started supporting 4K at 60fps video uploads in 2013, and 8K video uploads in 2015. Although these formats are still a small proportion of the whole YouTube ingest profile, their uptake is accelerating.

We were hoping to standardize on 4K video until we discovered that one minute of uncompressed 4K video is about 32 GB. Twenty minutes of video filmed at 1080p, using the Apple ProRes 422 HQ codec, takes about 20 GB of storage space. Using the same codec, 20 minutes of 4K video takes about 200 GB.

I would be filming several interviews plus lots of B-roll, which would result in several terabytes of raw video files.
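For planning purposes, the arithmetic behind those estimates is simple: file size is bit rate times duration divided by eight. Here is a rough sketch; the bit rates are assumptions chosen only to land near the ProRes figures quoted above, and actual rates depend on the codec, resolution, frame rate, and chroma subsampling you shoot with.

```python
# Rough storage estimate for video footage: bytes = bit rate x duration / 8.
# The bit rates below are illustrative assumptions, not official codec specs.

def storage_gb(bit_rate_mbps: float, minutes: float) -> float:
    """Approximate file size in gigabytes for a given bit rate and duration."""
    bits = bit_rate_mbps * 1_000_000 * minutes * 60
    return bits / 8 / 1_000_000_000

footage_minutes = 20
for label, mbps in [("1080p (assumed ~130 Mb/s)", 130),
                    ("4K (assumed ~1,300 Mb/s)", 1_300)]:
    size = storage_gb(mbps, footage_minutes)
    print(f"{label}: ~{size:.0f} GB for {footage_minutes} minutes")
```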

Since none of our existing video cameras and drones could capture 4K video, this meant purchasing all new video equipment. After estimating the cost to capture and edit 4K video, we elected to go with Full HD (1920 x 1080) for now (early 2018).

I know I will be moving to 4K video in the near future.

I also had a requirement that our photographs be usable in print campaigns. Some material would be printed at 300dpi, while other material required 600dpi.

I had to ensure that our images were being captured and stored at high enough resolutions to support printing at 600dpi without upscaling or pixel interpolation.

The following table illustrates the pixel resolution needed to support different print sizes at different dots-per-inch targets.

| Megapixels | Pixel Resolution* | Print Size @ 600ppi | Print Size @ 300ppi | Print Size @ 150ppi** |
|---|---|---|---|---|
| 3 | 2048 x 1536 | 3.41″ x 2.56″ | 6.82″ x 5.12″ | 13.65″ x 10.24″ |
| 4 | 2464 x 1632 | 4.12″ x 2.72″ | 8.21″ x 5.44″ | 16.42″ x 10.88″ |
| 6 | 3008 x 2000 | 5.01″ x 3.34″ | 10.02″ x 6.67″ | 20.05″ x 13.34″ |
| 8 | 3264 x 2448 | 5.44″ x 4.08″ | 10.88″ x 8.16″ | 21.76″ x 16.32″ |
| 10 | 3872 x 2592 | 6.46″ x 4.32″ | 12.91″ x 8.64″ | 25.81″ x 17.28″ |
| 12 | 4290 x 2800 | 7.15″ x 4.67″ | 14.30″ x 9.34″ | 28.60″ x 18.67″ |
| 16 | 4920 x 3264 | 8.20″ x 5.44″ | 16.40″ x 10.88″ | 32.80″ x 21.76″ |
| 20 (35mm film, scanned) | 5380 x 3620 | 8.97″ x 6.03″ | 17.93″ x 12.06″ | 35.87″ x 24.13″ |
| 24 | 6016 x 4016 | 10.03″ x 6.69″ | 20.05″ x 13.39″ | 40.11″ x 26.78″ |
| 36 | 7360 x 4912 | 12.27″ x 8.19″ | 24.53″ x 16.37″ | 49.06″ x 32.74″ |

*Typical Resolution. Actual pixel dimensions vary from camera to camera.

**At 150ppi, printed images will have visible pixels and details will look “fuzzy”.

Each colored box represents a certain number of megapixels. The numbers along the top and left side are print dimensions in inches at 300ppi (pixels per inch). Most books and magazines require 300ppi for photo quality. For example, the chart shows that you can make a 5″ x 7″ photo quality print from a 3 megapixel camera.

Chart: Print Megapixels Required (print dimensions in inches at 300ppi; numbers inside colored boxes are megapixels)

Notice that as the print size doubles, the megapixels required roughly quadruple, since both dimensions double. You can make nice 8″ x 10″ prints with a 6 or 8 megapixel camera, but to make a true photo-quality 16″ x 20″ print, you need between 24 and 30 megapixels. Don’t be fooled by manufacturers’ claims that say you can make 16″ x 20″ prints from an 8 megapixel camera. While you certainly can make a print that size, it will not be true photo quality.
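The table and chart both follow from the same relationship: required pixels are simply print inches multiplied by pixels per inch. A quick sketch of the arithmetic, using print sizes from the examples above:

```python
# Required pixels for a print: width_px = width_in x ppi, height_px = height_in x ppi.
# Megapixels grow with the square of the print dimensions, which is why doubling
# a print's size roughly quadruples the megapixels you need.

def required_megapixels(width_in: float, height_in: float, ppi: int = 300) -> float:
    """Megapixels needed to print at the given size and density without upscaling."""
    return (width_in * ppi) * (height_in * ppi) / 1_000_000

for width, height in [(5, 7), (8, 10), (16, 20)]:
    mp = required_megapixels(width, height)
    print(f'{width}" x {height}" @ 300ppi needs about {mp:.1f} megapixels')
```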

Having gone through this exercise, I was able to estimate what new equipment we would need, set some standards for capturing videos and images, and procure the resources required to edit and store the digital assets.



How ‘Facebook Zero’ Will Impact Social Media Marketing

Facebook’s latest algorithm update (Facebook Zero) will have a significant impact on Social Media Marketing efforts on the platform. In this article, you’ll find out what to expect from the changes and learn how you can best maintain interaction and visibility with audiences in the Facebook news feed.

Facebook Page Organic Reach

Facebook is making significant changes to its news feed algorithm in an effort to prioritize “meaningful” person-to-person interactions among friends and family over posts from Facebook Pages. Facebook has acknowledged that passively consuming articles or videos that don’t spark engagement or interaction is bad for a person’s mood.

While Facebook still values FB Page content as an important part of their platform’s ecosystem, the news feed will shift the focus from ranking content that’s directly consumed from Pages (which will shrink in reach) to content that is shared and talked about among friends (which will grow).

Reducing or eliminating this passive consumption may be a positive for the user experience, but these updates will result in fewer public posts from FB Pages and fewer videos in users’ news feeds unless companies and organizations pay to promote that content.

This will continue to cause a decline in referral traffic from Facebook (which is already down over 25%), a topic I discussed in a previous blog post titled: Referral Traffic Trends: Facebook Declines, Google Grows.

Mark Zuckerberg said this about the planned updates to the news feed:

“As we roll this out, you’ll see less public content like posts from businesses, brands, and media. And the public content you see more will be held to the same standard – it should encourage meaningful interactions between people.”

Organic reach for business pages has been declining for a long time. The Facebook Zero algorithm update just accelerates the decline. The chart in this article clearly shows the decrease in organic reach for page posts since 2012.

Facebook is a for-profit company. As such, it is financially motivated to slowly decrease the efficacy of its organic (i.e., free) advertising in order to move people to its paid ad platform.

Google did the same thing with its search engine results, which drove revenue growth for its AdWords paid search advertising service. In a previous blog post I discussed how this ultimately caused Google to lose market share for product searches on the Internet. Amazon now leads Google in product searches.

That is why I (along with the European Union) no longer call Google a search engine. I refer to it as a Paid Placement Advertising Service.

WHAT DO THESE CHANGES MEAN FOR FACEBOOK MARKETERS?

If you rely primarily on organic traffic from Facebook, then the Facebook Zero algorithm is bad news.

As Facebook begins to prioritize content that facilitates meaningful people-to-people connections and interactions, marketers can expect the organic reach of their Facebook pages to drop.

Specifically, users will see more posts from people they’re connected to in the news feed and less content from pages they follow. This is a significant change from the previous values of the news feed, which prioritized the total time spent on Facebook and how many people shared posts directly.

Users will also start seeing less video content in the news feed, because video, particularly public video, typically generates less conversation.

Facebook will give your followers the option to see your content first in their News Feed, though few users are likely to figure out how to set it. Facebook also hints that Groups and Live Video may get a boost with the new update.

WHAT TYPE OF CONTENT WILL GET PRIORITIZED IN THE FEED?

Facebook has stated that posts that spark and inspire conversations and meaningful interactions among people will be prioritized over other public content in the news feed after the update.

Content that is shared and commented on will receive higher priority. Comments are more valuable than Likes, and comments that take time and thought to type out will be a positive ranking signal for the content.

HOW CAN BUSINESSES AND ORGANIZATIONS RESPOND TO THESE CHANGES?

Instead of posting everything, businesses and organizations will have to be more strategic about the content they post. They need to create posts that promote meaningful connections and interactions among people. Those will be prioritized in the news feed.

It is also wise for businesses and organizations to use additional platforms and strategies for getting their content in front of their target audience. If your business doesn’t have an audience on other platforms, now is the time to start building one.

I will talk more about that in future posts.

Until then, you may be interested in these related posts: Social Media Content Half Life, Big Data & Psychometric Marketing, Marketing Uplift & Predictive Modeling, and Targeted Marketing Models.



Referral Traffic Trends: Facebook Declines, Google Grows

In the back-and-forth battle over who reigns supreme in referral traffic, Google has once again taken the lead from Facebook.

Referral Traffic Trends: Facebook Declines, Google Grows

In July 2015, Facebook pulled ahead of Google as the top source of referral traffic (source: Parse.ly). Facebook’s share of referral traffic grew to 38.3 percent vs. Google’s 35.8 percent.

However, beginning in February 2017, websites began experiencing a significant and steady decline in referral traffic from Facebook, while referral traffic from Google began to increase slightly.

After six months of steadily declining numbers, Facebook referral traffic is now down over 25%.

Referral Traffic Trends: Facebook Declines, Google Grows

It is important to point out that the decrease in referral traffic from Facebook has not affected all publishers equally. Some have seen huge drops in traffic, while others have seen an increase in the same period.

Two-thirds of publishers saw a decrease in referral traffic over this period, while the remaining one-third saw an increase. For half, traffic decreased a substantial 20% or more, and for 20% of publishers, Facebook referral traffic was cut by 50% or more.

Facebook continues to tweak the algorithms it uses to govern what posts appear in the News Feed. It appears that Facebook is downgrading the chances of content from businesses and organizations appearing in News Feeds, perhaps in an attempt to create more demand for its advertising business.

This topic is discussed (and confirmed) in a BriteWire blog post titled: How ‘Facebook Zero’ Will Impact Social Media Marketing.



Amazon #1, Google Drops To #2 In Product Searches

Amazon has become the top place consumers go on the Internet to search for products as Google continues to lose market share for e-commerce search.

Google – you blew it!

Amazon #1, Google Drops To #2 In Product Searches

When you all but eliminated “products” from your organic search results to create demand for your paid placement “Google Shopping” monetization scheme, you ceased to be a search engine. Instead, you became a Paid Placement Advertising Service. Your index of products available for purchase was reduced to those stores willing to pay you to include their products in your Google Shopping product index.

Google Shopping is powered by two platforms: AdWords and Google Merchant Center. Google Merchant Center is where stores submit their product data feeds. AdWords is where stores create Google Shopping ads to promote the products in the Google Shopping product index. Only products that are submitted to Google Merchant Center and paid to be promoted via AdWords appear in the Google Shopping product index.

Google’s “pay to play” shopping scheme resulted in only a small subset of products being represented in its search results, and allowed Amazon to take a leadership position in product search. It also resulted in a $2.7BN fine for EU antitrust violations over Google Shopping searches.

Fifty-five percent of people in the U.S. now start their online shopping trips on Amazon.com (source: BloomReach 2016 survey). That statistic marks a 25 percent increase from the previous year’s survey, when 44 percent of online shoppers said they turned to Amazon first. Over the same period, the percentage of shoppers who start product searches on search engines like Google dropped from 34 percent to 28 percent.

Amazon is exploiting Google’s “weakness” in online shopping, as shoppers increasingly start their product searches on Amazon instead of Google (source: Forrester Research report, 2017). According to the report, consumers are 2.5 times more likely to find out about the brand of a recent purchase from Amazon than from any other search. “This erosion is likely to continue” for Google, the report said.

According to a survey by the financial services firm Raymond James, more than half of people start their search for online shopping on Amazon now, while only 26% use search engines like Google as the starting point.

Perhaps more concerning is that search engines’ share has been cut roughly in half since 2014, while Amazon’s share has significantly increased over the past two years.

And it doesn’t look like the trend will change any time soon. The prime 18 to 29 year age group prefers Amazon by a wide margin when it comes to online product searches, according to the survey.

Product Search Starting Point

Product Search Preference By Age Group



Targeted Marketing Models

Targeting your best prospective core customers in the markets with the highest probability of success establishes clear marketing goals that can be measured accurately through customer acquisition and sales conversion.

Targeted Marketing Models
In a previous post I covered Marketing Uplift & Predictive Modeling.

This post will discuss using Predictive Modeling & Marketing Uplift combined with classic Customer Analytics to create more accurate Targeted Marketing Campaigns.

The foundation of any accurate marketing model is a precise definition of your core customer, which encompasses four key aspects:

  1. Who your best customers are
  2. Where your best potential customers can be found
  3. What messages those customers respond to & when to send them
  4. The value potential the customers have to you, either in terms of dollars or visits

Customer analytic solutions have traditionally been used to gather some of those customer insights (data dimensions).

However, a properly defined marketing model combined with customer analytics can not only quantify the benefit of your marketing programs more accurately, but also build out a deeper definition of your core customer.

Predictive Modeling can be used to understand the potential benefit of a new marketing program before it is launched. It can help you predict if it is a good investment or a bad one.
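As a rough illustration of that idea, the sketch below scores a prospect list with a stand-in predicted response probability and compares the expected return against an assumed contact cost before any money is spent. Every figure in it, including the cost, the margin, and the distribution used in place of real model scores, is a hypothetical placeholder, not a benchmark.

```python
# Sketch: use predicted response probabilities to judge a campaign before launch.
# All numbers (scores, costs, margins) are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(0)
predicted_response = rng.beta(2, 30, size=10_000)   # stand-in for real model scores

cost_per_contact = 1.50        # assumed cost of reaching one prospect
margin_per_conversion = 80.0   # assumed profit from one converted customer

expected_profit = predicted_response * margin_per_conversion - cost_per_contact

# Only contact the prospects whose expected profit is positive.
worth_contacting = expected_profit > 0
print(f"Contact {worth_contacting.sum():,} of {len(expected_profit):,} prospects")
print(f"Projected net benefit: ${expected_profit[worth_contacting].sum():,.0f}")
```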

It can also be used to optimize existing marketing programs by establishing and analyzing key metrics.

The ability to measure, analyze, and assess existing marketing programs can identify which programs are not meeting their potential and then determine the gap between actual performance and ultimate potential.

Correlating this marketing information with known customer data dimensions allows you to refine your Core Customer Profile.

This insight allows you to immediately reprioritize resources, reassigning budget and marketing effort from initiatives with no upside to the ones that are merely underperforming.

Customer analytics and marketing models tell you who your core customers are, where potential core customers are located within an underperforming market area, what messages they respond to and when, and the potential value of each of those customers. Together, these insights make it possible to develop a focused demographic and psychographic profile of customers to target, along with a measurable marketing campaign to go after them.

BriteWire sends the right message, to the right customer, at the right time.


Google AMP Is Not A Good Thing

Google’s AMP is bad. Google AMP is bad for how the web is built. Google AMP is bad for publishers. Google AMP is only good for one party: Google.

Google AMP
Do we really need Instant Articles (Facebook) and AMP (Google) when we can accomplish fast-loading pages with plain, uncomplicated HTML and CSS?
Web developers can use simple HTML and CSS to make clear, fast-loading pages. This does not require complicated tricks or techniques, and the pages can still be functional and have good graphic design.

When AMP (Google’s Accelerated Mobile Pages) first came out, I was optimistic. AMP’s aim was to make the web faster, and I am aligned with that goal.

What I dislike is the fact that Google caches AMP content and serves it from its own cache, under its own domain name. In other words, instead of being served from my websites, the content is served from Google.com.

This is a clever scheme to “trap” users on a Google owned ecosystem.

To entice developers and publishers to implement AMP, Google let it be known that AMP-enabled websites would rank well in Google search results. To be clear, Google has officially stated that AMP support does not affect a site’s search ranking, but it has made it very clear that it will penalize websites that do not render quickly on mobile devices. AMP-enabled content renders quickly on mobile devices, so developers concluded that enabling AMP would be a good strategy for ranking well in Google’s search results.

Another search ranking benefit of AMP is that only AMP-enabled sites are shown in Google’s carousel feature. Getting featured there is clearly important for big publishers.

Google’s AMP is bad. Google AMP is bad for how the web is built. Google AMP is bad for publishers. Google AMP is only good for one party: Google.



Marketing Uplift & Predictive Modeling

Marketing Uplift (aka Marketing Lift) and Predictive Modeling are hot concepts in marketing. In this post we take a quick look at these marketing techniques.

Marketing Uplift

Marketing Uplift (aka Marketing Lift) is the difference in response rate between a treated group and a randomized control group. For example, if 6% of customers who receive an offer convert versus 4% of a randomized control group that does not receive it, the uplift attributable to the offer is two percentage points.

Marketing Uplift Modeling can be defined as improving (upping) lift through predictive modeling.

A Predictive Model predicts the incremental response to the marketing action. It is a data mining technique that can be applied to engagement and conversion rates.

Uplift Modeling uses both the treated and control customers to build a predictive model that focuses on the incremental response.

Traditional Response Modeling uses only the treated group to build a predictive model, which separates the likely responders from the non-responders.
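A common way to turn two such response models into an uplift model is the “two-model” (or T-learner) approach: fit one response model on the treated group and one on the control group, then score each customer’s uplift as the difference between the two predicted probabilities. The sketch below uses synthetic data and scikit-learn’s logistic regression purely for illustration; the feature names and effect sizes are made up.

```python
# Two-model uplift sketch: uplift = P(respond | treated) - P(respond | control).
# The synthetic data and feature names are illustrative only.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 5_000
df = pd.DataFrame({
    "recency": rng.integers(1, 365, n),
    "frequency": rng.poisson(3, n),
    "monetary": rng.gamma(2.0, 50.0, n),
    "treated": rng.integers(0, 2, n),      # 1 = received the campaign
})
# Synthetic outcome: a baseline response rate plus a small treatment effect.
logit = -3.0 + 0.2 * df["frequency"] + 0.4 * df["treated"]
df["responded"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

features = ["recency", "frequency", "monetary"]
treated, control = df[df["treated"] == 1], df[df["treated"] == 0]

model_t = LogisticRegression(max_iter=1000).fit(treated[features], treated["responded"])
model_c = LogisticRegression(max_iter=1000).fit(control[features], control["responded"])

df["uplift"] = (model_t.predict_proba(df[features])[:, 1]
                - model_c.predict_proba(df[features])[:, 1])

# The "Persuadables" are the customers with the highest predicted uplift.
print(df.sort_values("uplift", ascending=False).head())
```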

Uplift Modeling segments an audience into the following primary groups:

  • The Persuadables : audience members who only respond to the marketing action because they were targeted
  • The Sure Things : audience members who would have responded whether they were targeted or not
  • The Lost Causes : audience members who will not respond irrespective of whether or not they are targeted
  • The Do Not Disturbs or Sleeping Dogs : audience members who are less likely to respond because they were targeted

The only segment that provides true incremental responses is the Persuadables.

Because uplift modeling focuses only on incremental responses, it provides very strong return-on-investment cases when applied to traditional demand generation and retention activities. For example, by targeting only the persuadable customers in an outbound marketing campaign, contact costs are reduced and the return per unit of spend can be dramatically improved.

One of the most effective uses of uplift modeling is removing negative effects from retention campaigns. In both the telecommunications and financial services industries, retention campaigns can trigger customers to cancel a contract or policy. Uplift modeling allows these customers, the Do Not Disturbs, to be removed from the campaign.


Big Data & Psychometric Marketing

Psychometrics, sometimes also called psychographics, focuses on measuring psychological traits, such as personality. Psychologists developed a model that sought to assess human beings based on five personality traits, known as the “Big Five.” Big data correlated with personality profiles allows for accurate psychographic targeting by marketers.

Psychometric Marketing

Psychometrics is a field of study concerned with the theory and technique of psychological measurement. Psychometric research involves two major tasks: the construction of measurement instruments, and the development of procedures for measurement. Practitioners are described as psychometricians. The most common model for expressing an individual’s personality psychometrically is the Big Five personality traits.

5 Personality Traits (Big Five)

  • Openness (how open are you to new experiences?)
  • Conscientiousness (how much of a perfectionist are you?)
  • Extroversion (how sociable are you?)
  • Agreeableness (how considerate and cooperative are you?)
  • Neuroticism (how easily upset are you?)

Based on these five dimensions, also known as OCEAN (Openness, Conscientiousness, Extroversion, Agreeableness, Neuroticism), we can make a relatively accurate assessment of a person. This includes their needs and fears, and how they are likely to behave.

The “Big Five” has become the standard technique of psychometrics. The problem with this approach has been data collection, because it required filling out a lengthy, highly personal questionnaire.

With the Internet and cell phones, this is no longer a problem.

The Internet allows researchers to collect all sorts of online data from users, and using data correlation they can associate online actions with personality types. Remarkably reliable deductions can be drawn from simple online actions: what users “liked,” shared, or posted on Facebook, and what gender, age, and place of residence they specified, can be directly correlated with personality traits.

While each individual piece of such information is too weak to produce a reliable prediction, when tens, hundreds, or thousands of individual data points are combined the resulting predictions become very accurate. This enables data scientists and marketers to connect the dots and make predictions on people’s demographic and psychographic behavior with astonishing accuracy.
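Here is a small, synthetic sketch of that “many weak signals” effect: hundreds of binary signals (think page likes), each only loosely tied to a trait, combined by a regularized linear model into a reasonably accurate trait estimate. The data is generated, not real, and the feature count is far smaller than the thousands of data points mentioned below.

```python
# Sketch: many weak binary signals combined into one trait prediction.
# All data here is synthetic; real studies use thousands of features per person.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n_people, n_signals = 2_000, 500

true_extroversion = rng.normal(size=n_people)
# Each signal (e.g. a page like) is only weakly related to the trait...
signal_weights = rng.normal(scale=0.3, size=n_signals)
like_probs = 1 / (1 + np.exp(-np.outer(true_extroversion, signal_weights)))
likes = (rng.random((n_people, n_signals)) < like_probs).astype(int)

# ...but a regularized model over hundreds of them recovers it quite well.
model = Ridge(alpha=10.0)
r2 = cross_val_score(model, likes, true_extroversion, cv=5, scoring="r2")
print(f"Mean cross-validated R^2: {r2.mean():.2f}")
```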

Smartphones are a psychological questionnaire on their users that is constantly being updated, both consciously and unconsciously. It is not uncommon for data analytics companies to have over 4,000 data points for each person in their target data set.

The strength of this kind of psychometric modeling is illustrated by how well it can predict a subject’s questionnaire answers from behavioral data alone.

It is now possible to assign Big Five values based purely on how many profile pictures a person has on Facebook, or how many contacts they have (a good indicator of extroversion).

Psychological profiles can be created from this data, but the data can also be used the other way around: to search for specific profiles.

This allows for accurate psychographic targeting by marketers in ways that have never before been possible.

BriteWire Intelligent Internet Marketing facilitates building psychometric user profiles through purposefully designed content interactions, and automates marketing responses based on those interactions.