SMX Paris I : AI as a service, AI for SEO, Vectors, Clusters and Audiences

Today was Day 1 of SMX Paris, an event I have been involved in for the past 5 years as a member of the Advisory Board. I spent the day enjoying sessions across a wide spectrum, with high-quality speakers. This year is clearly the year of AI and GDPR: AI in all the presentation titles and GDPR in all the Q&As. I discovered AI as a service – #AIAS – a bit of AI for SEO, vector analysis, clusters and audiences, and then some JavaScript indexing, AMP and PWA.

(Day 2 is here: Penalty recovery, snaploggin, e-privacy, chatbots and DSA)

This year’s SMX Paris is in a new venue, but some things do follow tradition: we start with a railway strike so that everyone is late, just like the past 3 or 4 editions. We are at the Marriott Rive Gauche, close to Denfert-Rochereau, in a very nice setup with a central area and access to three different tracks on the side.

 

 

BADams 

My first stop is on the SEO track as I finally get to meet Barry Adams face to face. I do love your Twitter handle, Barry – @badams. A bit of a legend, and this is his first time at SMX Paris, although he actually lived and worked here many years ago. Barry is here to tell us about SEO and JavaScript rendering. As he starts out explaining how search engines work, I get worried for a moment – is he really going to walk us through Crawl, Indexation and Ranking? But it turns out the indexation step actually splits into several phases: an initial HTML crawl as always, and then a second crawl with a Chrome 41 user agent rendering the JavaScript version of the page.

Barry makes an excellent case for server-side rendering of JavaScript frameworks like Angular or React (preferred), and we get to understand why client-side JavaScript for anything other than stylesheets is bad news for SEO. As the initial crawl sees no content and only schedules a second crawl – which can take 1-3 weeks to arrive and which is more demanding on resources (hence more costly and less attractive to Google) – there are very clear downsides to client-side JavaScript rendering for a website.
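To make the point concrete: a quick (and entirely unofficial) way to approximate what the first, HTML-only crawl wave sees is to strip scripts and styles from the raw HTML and look at what text is left. A minimal Python sketch of my own, not Barry's tooling:

```python
from html.parser import HTMLParser


class ShellDetector(HTMLParser):
    """Collects visible text outside script/style tags –
    roughly what the first (HTML-only) crawl wave can index."""

    def __init__(self):
        super().__init__()
        self.in_skip = False
        self.text = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.in_skip = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self.in_skip = False

    def handle_data(self, data):
        if not self.in_skip and data.strip():
            self.text.append(data.strip())


def first_wave_text(raw_html: str) -> str:
    parser = ShellDetector()
    parser.feed(raw_html)
    return " ".join(parser.text)


# A client-rendered app shell exposes almost nothing to the first wave,
# while a server-rendered page ships its content in the initial HTML:
shell = '<html><body><div id="root"></div><script src="app.js"></script></body></html>'
ssr = '<html><body><div id="root"><h1>Product name</h1><p>Description</p></div></body></html>'
```

Here `first_wave_text(shell)` comes back empty, while the server-rendered version exposes its headline and copy immediately – which is exactly why the second, rendered crawl wave matters so much for client-side sites.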

My question to Barry was:

How does this type of javascript rendering work out for tools like Webshed who allow you to externalize the management of your SEO optimisation?

Barry talks about SEO split-testing tools and the fact that they can be slow because they need to wait for the second crawl and its rendering of the changed code. Thanks Barry 😊

Online marketing and the AI perspective

I change tracks for the second session and go to the BingAds track. They have teamed up with some people from WPP – disclaimer: I used to work there – to build this session. It is a half French, half English session. We get insights from Alex Sinson into some of the things Microsoft are doing. Very impressive indeed. Richard George @richgeorge from Wavemaker steps in to give insights into AI and Big Data. Admittedly, a lot of the content presented is not reeeaaaaaly about Artificial Intelligence and more about automation and the overall technology changes we are facing – but it is a great way to make the topic concrete. I absolutely love the case study on breastfeeding in the UK, where Wavemaker worked with the British Government to create a Skill on Alexa (Amazon) – in effect a voice chatbot that works around the clock, and especially between 2 and 5 in the morning, when breastfeeding mothers need advice and comfort the most. Great use of technology for a noble cause.

My question to Alex Sinson of Bing @BingAdsFR:

What are your thoughts on Quantum computing? Recently, I heard people from Google talk about Quantum computing becoming a reality within 12 months.

No official announcement from Bing here but an acknowledgement of some major shifts coming up. Richard George adds in to confirm the importance of Quantum computing to solve some of our upcoming challenges in terms of data handling.

Thanks for a great session from Alex Sinson and Richard George.

The new challenges for Paid Search (Expert panel)

I am staying on the BingAds track as they have set up a panel discussion on a subject I really care about: major trends in Paid Search. This is the topic I will be presenting at the European Search Conference in Liverpool, so of course I will be listening very attentively to the panel discussion.

Some extracts from the panel inputs:

Oui.sncf: Damien Boistuaud explains how their main challenge is to qualify audiences and target the younger generation. Automation is a big thing for them this year as they are running Dynamic campaigns and Dynamic ads.

SFR: Benito Donison is not convinced by the concept of the user journey, as he feels the tracking tools do not perform well enough to support that approach. He is more focused on integrating Drive-to-store into the marketing mix.

The Experts panel – best photo I could get…

The founder of Resoneo, Richard Strul @RichardSTRUL, gives us the best input (disclaimer: we are good friends and have known each other for 10+ years): the main challenges are related to audience qualification and segmentation. In the case of their client Allopneus, they aimed to push audience qualification up from 23% and managed to reach 60-70% by integrating data from all the ad platforms. This created a new workload challenge and forced them to automate the set-up intensively. Richard also stressed the rise of Shopping, sometimes representing up to 60% of the full ad budget for some clients.

I liked the session and was happy to see a lot of the themes I will be presenting in Liverpool mentioned:

  • User journey
  • AI
  • Audience segmentation
  • Automation

Keynote with Andrey Lipattsev @andrey_l1nd3n, Google

How can you not love what this man tells us:

« We love the Web because it is the biggest collaborative project in Human history. ».

Where we do diverge a bit in our opinions, Andrey and I, is that he believes AMP (Accelerated Mobile Pages) and PWA (Progressive Web Apps) are major improvements to the web experience, where I see them as temporary patches to something we haven’t quite been able to fix: the free and open internet.

AI tools for SEO

I was curious about this session. On one hand because of the title, which didn’t quite make sense to me; I was really not expecting SEOs to be thinking about AI tools at this stage – generally speaking, I find the subject of AI hyped and exaggerated. But on the other hand, I was curious due to the fact that the speakers are two very clever and very respected search professionals, Sylvain @speyronnet and Guillaume Peyronnet @GPeyronnet. The original Search Bros 😉

This ended up being one of my absolute highlights of the day, as the guys explained how search engines convert phrases into vectors and how these mathematical objects are manipulated to create search results. And, in consequence, how the only way to create semantic clusters is by using similarly machine-learning-driven approaches to generate the keywords for the perfect cluster that enables you to hit the algorithm.
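As a toy illustration of the idea (real engines use dense learned embeddings, not this simple bag-of-words model), phrases can be turned into vectors and compared with cosine similarity, so that phrases about the same topic land in the same cluster:

```python
import math
from collections import Counter


def vectorize(phrase: str) -> Counter:
    # Toy bag-of-words vector: one dimension per word.
    return Counter(phrase.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0


# Related phrases score much closer than unrelated ones:
s_related = cosine(vectorize("cheap flight paris"), vectorize("cheap flight london"))
s_unrelated = cosine(vectorize("cheap flight paris"), vectorize("chocolate cake recipe"))
```

With these made-up phrases, `s_related` is around 0.67 while `s_unrelated` is 0 – the kind of distance measure a clustering step can build on.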

The Peyronnet brothers explained the Rocchio algorithm and the importance of user rating feedback in the ranking system. They also demonstrated how you could use machine learning techniques – decision trees, « R » and « Random Forest » – to do your own ranking factors study. As always, the data input makes all the difference and is the biggest challenge for any research you do. You need high-quality, homogeneous data in decent volumes to be able to make any real findings.
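The classic Rocchio update moves the query vector toward the centroid of documents users judged relevant and away from the non-relevant ones. A minimal pure-Python sketch of the textbook formulation with the usual α/β/γ weights (my own illustration, not the brothers' demo code):

```python
def rocchio(query, relevant, nonrelevant, alpha=1.0, beta=0.75, gamma=0.15):
    """Rocchio relevance feedback: q' = alpha*q + beta*mean(relevant)
    - gamma*mean(nonrelevant), with negative weights clipped to zero."""

    def centroid(docs):
        if not docs:
            return [0.0] * len(query)
        return [sum(d[i] for d in docs) / len(docs) for i in range(len(query))]

    rel_c = centroid(relevant)
    non_c = centroid(nonrelevant)
    updated = [alpha * q + beta * r - gamma * s
               for q, r, s in zip(query, rel_c, non_c)]
    # Negative term weights are conventionally dropped.
    return [max(0.0, w) for w in updated]


# Toy example: the query gains weight on the term shared by relevant
# docs (dim 1) and loses the term from the non-relevant doc (dim 2).
refined = rocchio([1, 0, 0], relevant=[[0, 1, 0]], nonrelevant=[[0, 0, 1]])
```

Here `refined` comes out as `[1.0, 0.75, 0.0]`: the feedback has pulled the query toward the vocabulary of the documents users liked.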

I am surely not doing full justice to this session by means of these explanations, but I was truly inspired by this very technical and in-depth presentation that pretty much explained how RankBrain probably works.

This was the session where I didn’t ask any questions. Thanks guys !

Microsoft Bing outlook

For the last session, I went back to the BingAds track. I was interested in understanding where Microsoft are going, and this session gave a full perspective of the state of the art at Microsoft. They presented the « Microsoft Graph » – the mapping of the various data points they can use for their advertising solutions. Worth noting that LinkedIn is now part of that mix, allowing you to target companies, professions and job titles in your digital marketing.

 

During the day, the concept of AI as a Service (AIAS) had really dawned on me. IBM Watson is a remote service and Google’s ML services are tools you can plug into. Microsoft frames this as their « Cognitive Services », and these too are tools that you can access via APIs.

Microsoft demonstrated a number of applications based on face recognition and sentiment, but my preferred case was the video depicting a blind Microsoft engineer who has developed a tool that uses Microsoft AI to inform him about his surroundings through photos, image recognition and sound restitution.

My question to Bing:

I am really impressed with your use of AI for image processing and voice. Are you using any of these tools to identify things like Propaganda, Fake news and Fake profiles?

The answer: all of Microsoft’s AI tools originated from Bing, which is where we have had the most use for them. They are applied to things we see on the web and remove before we show them to users. We cannot reveal how we do this.

I had hoped they would make some tools available to the wider public and maybe use human input so we can all Fight the Fake together…

SEMY Awards

The day ended with the SEMY Awards ceremony celebrating the best French case studies in various categories. This is a great addition to the SMX conference and I have great respect for the various winners of the awards. We had an odd moment with a power outage but got safely through to the networking cocktail at the end.

Want to follow?

A long day at SMX full of inspiration and great discussions. I have been tweeting a lot of this during the day on @soanders, some in English, some in French. Please follow me on Twitter if you find this useful.

Or read on. Day 2 is here: Penalty recovery, snaploggin, e-privacy, chatbots and DSA

SMX London 2018: Artificial Intelligence, Privacy, Niche-focus for SEO and the importance of Speed

SMX London 2018 was a great edition. I am just back from the event and still have my head full of all the things we discussed on the closing panel I was part of.

The main topics were of course Privacy and GDPR on one hand and then Automation and Artificial Intelligence on the other. In the conference programme, various topics touched on AI: Automation for PPC and Voice Search being the most prominent ones. But GDPR was discussed in every break, over every drink and in most of the questions asked during the individual sessions.

 

The conference addressed some really interesting questions this year:

  • Should you use ranking factor studies to guide the way you do SEO, or simply use them for inspiration?
  • What are your options when organic reach on Facebook and other social media is dropping?
  • Is AMP for mobile rendering or just a temporary patch to a problem of speed?
  • Should we, as marketers, be worried about the arrival of AI in the optimization interfaces?

Overall, there was a lot of discussion of speed. Speed as a ranking factor – not time to first byte but time to full page render, as Marcela de Vivo from Semrush stressed. There were sessions on AMP, which is in essence a way to speed up pages.

At the beginning of the year, my prediction was that AI would be big but that it would be more hype than reality for the digital marketer this year. Well, it looks like AI is more prominent than that. Frederick Vallaeys from Optmyzr reminded us that the Quality Score was the first machine learning functionality in AdWords and has been around since the beginning of Paid Search. And we have seen new ML-driven functionalities enter the scene little by little. The Smart Bidding options (Target CPA, Target ROAS, Maximize Clicks and Maximize Conversions) are all part of this, as Brad Geddes reminded us. And in a separate session, Ann Stanley showed an example of smart bidding for Shopping campaigns involving remarketing: Optimize by Goals, which you can find here.

What we have yet to see, perhaps, is the shift from Machine Learning (ML) into Artificial Intelligence (AI) for these functionalities.

Voice search is another area where Artificial Intelligence has a big role to play. It was covered in the keynote by Behshad Behzadi, but also at a more concrete level in a very popular session by Pete Campbell:

As is often the case, I met some wonderful people in the networking around the event – and also had some of the more interesting discussions off the record:

  • Changes in the Google algorithm favouring niche approaches
  • The need to constantly renew ourselves in this industry of constant change
  • The need of businesses in other industries to learn from our experience, as they will soon be facing the same challenges of constant change that we have faced for 20 years, due to major technological, organisational and behavioural disruption across all business sectors.

Soon it will be time for SMX Paris – the programme is very different and there is no real speaker overlap so it will be another exciting conference on Search, Social, Analytics and Digital marketing overall.

Content Marketing: is it SEO or SMO? (SMX Paris)

It is not a simple “either… or…” question that we decided to deal with at SMX Paris this year, together with Erick Hostacy from Yourastar. My background is Search Marketing whereas Erick’s background is Social Media.

 


We decided to deal with the subject in a provocative way: I would be the SEO-man and Erick would be the SMO-man. Each of us introduced the subject, explaining how “Content Marketing is SEO” and “Content Marketing is SMO”, to then alternately present our proof case studies, illustrating our case and making nasty albeit almost politically correct comments on each other’s cases.

I have extracted my part of the presentation below and will embed Erick’s presentation if he decides to publish it also.

Thanks to our moderator Annie Lichtner and thanks to Erick for a session people seemed to enjoy.
We had a lot of fun 🙂

That animal-named update of the biggest search engine which deals with entities

Just home from SMX London, I have realised just how important some of the take-aways were. The biggest one for me was the session about Entity Search, moderated by Danny Sullivan, with David Amerland and Justin Briggs speaking.

Before reading any further, do this:

  1. copy the title of this article
  2. use the title as a search query in Google*

Your result page will show the answer to the natural language query I used as a title for this article.

* Since the article was published and the title was republished on inbound.org, the search results page has changed. Before indexation, Google would show an article by Danny Sullivan about Hummingbird and Entity Search. Now it is this article which comes out on top, at least if the author is in your circles on G+.

From Recovery to Discovery

Those in search marketing will realise that this is new. It seems like search engines have always been based on keywords and served a purpose of navigational recovery: the search results show me what is known but hidden somewhere (for example in my mind, or in my bookmarks).

With a search results page adding something not spelled out in my query, I no longer need the concept of keywords; I can move on to natural language querying, and the search engine can serve a function of Discovery: finding something I didn’t know of.

The Art of Not using Keywords

I find this inspiring and started playing around with some queries. Justin Briggs gave me the first couple:

These queries are all about films, because this semantic territory is very structured, with entities and properties easy to define, but I will come with examples from another territory further ahead.

From Keywords to Entities

Where the search paradigm used to be based on keywords – their prominence in the pages and in the backlinks – we seem to have moved beyond this. The queries do not name the answer we are looking for, whereas the search results page does. We are perhaps seeing the result for the « Entity » that has been identified as the answer to the query.

So what is Entity Search?

At SMX, David Amerland made a reference to US Patent number 8538984, which supposedly is the foundation for this Google update. The patent can be found here as well: http://www.patentbuddy.com/Patent/8538984

As opposed to other recent animal-named Google algorithm updates, Panda and Penguin, which are considered add-ons, Hummingbird is a complete rewrite.

What the Hummingbird update does is move away from the concept of keywords and into the underlying entities. A search query is analysed in order to understand the meaning behind the keywords within it, and the search results are based on the presence of entities.

We see this effect more clearly on natural language searches using more words in the query. It also likely works more effectively in structured semantic territories. Justin Briggs pointed to Freebase (a Google acquisition) as a likely source of structure to the entities.

Exploring Entities

Let’s have a look into Freebase and the structure of entities. It is almost noon, so I am starting to think about food. Here is a view of the entity « Dish » in Freebase:

  • Type of dish: starter, dessert, …
  • Cuisine: French, Indian, …
  • Ingredients: Eggs, Ham, …
  • Recipes

 

So, Dish is an entity under Food, and it has the properties: Type, Cuisine, Ingredients and Recipes.

This inspired me to test this query:

« how to prepare that mexican starter with fish, shrimps and lime juice »


The result on top comes out as a Mexican recipe for ceviche. Out of the 12-word query, only one of the words appears in the title, and I see entity attributes highlighted in the snippet: shrimp, fish, lime, dessert, but also, disappointingly, the word « make », which is neither a keyword nor an entity property.
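A toy sketch of the intuition (the entity data and the scoring are entirely made up, not how Google actually ranks): score candidate entities by how many of their attributes the natural-language query mentions, rather than matching the query's keywords against page text:

```python
# Hypothetical mini knowledge base, in the spirit of Freebase's "Dish" type.
DISHES = {
    "Ceviche": {"type": "starter", "cuisine": "mexican",
                "ingredients": {"fish", "shrimp", "lime juice"}},
    "Quiche":  {"type": "starter", "cuisine": "french",
                "ingredients": {"eggs", "ham", "cream"}},
}


def score(query: str, props: dict) -> int:
    """Count how many entity attributes the query mentions."""
    q = query.lower()
    hits = sum(1 for attr in (props["type"], props["cuisine"]) if attr in q)
    hits += sum(1 for ing in props["ingredients"] if ing in q)
    return hits


def best_dish(query: str) -> str:
    # Return the entity whose attributes best overlap the query.
    return max(DISHES, key=lambda name: score(query, DISHES[name]))
```

Against my query from above, the overlap on « mexican », « starter », « fish », « shrimp » and « lime juice » surfaces the ceviche entity even though the word « ceviche » never appears in the query.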

The Future of Search?

With the number of changes that have occurred within Search Marketing, and especially SEO, during the past couple of years, I hadn’t really noticed this under-the-hood change. And I don’t think it will change Search Marketing instantly, as I believe the shift to entity-driven results will be gradual. Most users have been “keyword search” educated and it will take some time to unlearn. SEOs, however, should start shifting tracks now. Keyword research has become obsolete – we need to look for Entities, Properties and Attributes.

Read more :

http://justinbriggs.org/entity-search-results-the-on-going-evolution-of-search

http://nlp.stanford.edu/software/CRF-NER.shtml

http://www.freebase.com/

http://www.seobythesea.com/2012/06/search-engines-and-entities/

http://searchengineland.com/killer-seo-string-entity-optimization-171094

 

SMX France, Paris 2010


June 2010, Paris France
Speaking on:
» Social networks, search and reputation management

 

Moderating
» SEO ranking factors in 2011
» SEO for large-scale websites
» Duplicate content & the « canonical » tag
» Bringing SEO in-house: how to succeed?
» Beyond the usual link building

SMX Advanced, London 2010


May 2010, London, United Kingdom
» Yahoo-Microsoft: The new Search Powerhouse
» Moderator: Leveraging Digital Assets For Maximum SEO Impact
» Moderator: Search Ad Quality, Under The Microscope