Background of SEO
No More Need to Trash the Flash
By Kalena Jordan in Kalena Jordan's Blog
Google has delighted web designers the world over with the announcement that they are now indexing Adobe Flash files better than ever before.
Google can now index the textual content embedded in Flash files and discover URLs within them. From the official Google blog post:
“Q: What content can Google better index from these Flash files?
All of the text that users can see as they interact with your Flash file. If your website contains Flash, the textual content in your Flash files can be used when Google generates a snippet for your website. Also, the words that appear in your Flash files can be used to match query terms in Google searches.
In addition to finding and indexing the textual content in Flash files, we’re also discovering URLs that appear in Flash files, and feeding them into our crawling pipeline—just like we do with URLs that appear in non-Flash webpages. For example, if your Flash application contains links to pages inside your website, Google may now be better able to discover and crawl more of your website.”
The new algorithm was made possible thanks to Adobe’s new Searchable SWF library:
“We’ve developed an algorithm that explores Flash files in the same way that a person would, by clicking buttons, entering input, and so on. Our algorithm remembers all of the text that it encounters along the way, and that content is then available to be indexed. We can’t tell you all of the proprietary details, but we can tell you that the algorithm’s effectiveness was improved by utilizing Adobe’s new Searchable SWF library.”
Search Engine Optimization (SEO) is very difficult, but it will soon get much, much harder.
Six years ago you could easily count the firms doing SEO work. The number of sites competing for each search term was smaller, and the state-of-the-art spamming tricks centered on white-on-white text and multiple title tags. Getting a top ranking out of the 50,000 results returned was relatively easy to accomplish because those 50k results were most often poorly optimized. Getting into the top 10 meant being in the top 0.02% of the 50k results, which is certainly not trivial, but easily accomplished when facing naive competition. With a little effort, site owners could do it themselves.
Now the tip of the iceberg is larger. Time has caused two things to happen: the SEO competition is trickier than it was six years ago, and the population of web sites is larger. There are now around 250,000 results for most two-word searches, and the new sites are often tuned by SEO practitioners. Instead of needing to be in the top 0.02%, you now need to be in the top 0.004% of a more competitive group of pages. Such rankings are still possible, but beyond the ability of most site owners.
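The percentile arithmetic above can be sketched in a few lines. This is purely illustrative; the result counts (50,000 and 250,000) are the article's estimates, not measured data.

```python
def top10_percentile(total_results: int) -> float:
    """Return the top-10 cutoff as a percentage of all results for a query."""
    return 10 / total_results * 100

# Six years ago: roughly 50,000 results per two-word query.
print(f"{top10_percentile(50_000):.3f}%")   # prints 0.020%

# Today: roughly 250,000 results per two-word query.
print(f"{top10_percentile(250_000):.3f}%")  # prints 0.004%
```

As the index grows, the same top-10 slot shrinks to an ever-smaller fraction of the result set.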
But now we see that it is truly the tip of an iceberg. What we have never before seen is the massive amount of hidden content that previously resided behind the barriers inherent to dynamic-content web sites. It has been estimated that the web is actually 500 times larger than the number of pages spidered to date. For the search engines, the good news is that a 90:10 rule applies: by adding only 50 times more pages, instead of the 500 times predicted to exist as hidden pages, they will account for 90% of the frequently used content on the web.
For SEO practitioners, this means that rapid growth in the number of indexed pages is imminent. As a result, getting a top 10 ranking will soon mean getting a site ranked in the top 0.0001% of the results (top 10 out of roughly ten million). Obviously this is not work for lesser SEO practitioners, and it is certainly beyond the capabilities of most site owners. In fact, I suspect that most SEO firms will be unable to satisfy their clients' ranking requests (close, but no top 10), and there will be significant client dissatisfaction with SEO results, with clients employing three or more SEO practitioners without success.
Web design firms will find that the rush to the web comes to a crawl. Many web designers will be hard pressed to find new business as prospective clients find search engine ranking beyond their financial reach. If so, I expect a renewed focus on alternate promotion activities such as PPC engines, ads, and yes, maybe even resurrected banners, at least for the initial nine months or so of each SEO project. Now the numbers game starts. Suppose there are 200k results today for a two-word query, and suppose there will soon be 50 times as many pages for each query.
I believe it is realistic to find ourselves with ten million pages in response to each query within one to two years. Now suppose the average page starts in the middle (rank 5,000,000), and that the ranking can always be improved by 90% with each consecutive "improvement". After tweak 1 the ranking improves by 90% to 500k; tweak 2: 50k; tweak 3: 5k; tweak 4: 500; tweak 5: 50; and tweak 6 may result in a top 10 ranking. Nobody can get six consecutive 90% gains while facing increased competition from optimized sites without the right analysis tools and methodologies, and many tweaks may prove ineffective.
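The tweak arithmetic above can be sketched directly, assuming the article's optimistic model where every tweak cuts the rank to 10% of its previous value:

```python
def tweaks_to_top10(start_rank: float, improvement: float = 0.90) -> int:
    """Count how many tweaks are needed to reach the top 10,
    if each tweak improves (reduces) the rank by `improvement`."""
    rank = start_rank
    tweaks = 0
    while rank > 10:
        rank *= (1 - improvement)
        tweaks += 1
    return tweaks

# Starting in the middle of ten million results:
print(tweaks_to_top10(5_000_000))  # prints 6  (500k -> 50k -> 5k -> 500 -> 50 -> ~5)
```

Six consecutive 90% gains is the best case; the next paragraph argues that real gains shrink with each tweak.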
In fact, the effectiveness of consecutive tweaks diminishes: tweak one may yield 90%, tweak two maybe 75%, tweak three maybe 60%, and so on. Remember, your competition is tweaking at the same time, and the search engines are constantly refining their algorithms to keep pace with this new content. Each consecutive tweak is more difficult than the previous one due to the quality and quantity of the competition. This means that only the top SEO practitioners can ever attain a top 10 result for a meaningful keyword. The rest will just fade away.
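A variant of the earlier sketch shows why diminishing returns are so punishing. The first three gains (90%, 75%, 60%) are the article's estimates; the later values in the schedule are an assumption for illustration only.

```python
def rank_after_tweaks(start_rank: float, gains: list[float]) -> list[float]:
    """Apply each tweak's improvement in sequence; return the rank after each."""
    ranks = []
    rank = start_rank
    for g in gains:
        rank *= (1 - g)  # a gain of 0.75 leaves 25% of the previous rank
        ranks.append(rank)
    return ranks

# First three gains from the article; the rest assumed to keep shrinking.
gains = [0.90, 0.75, 0.60, 0.45, 0.30, 0.15]
for i, r in enumerate(rank_after_tweaks(5_000_000, gains), start=1):
    print(f"tweak {i}: rank ~{r:,.0f}")
```

Under this schedule, six tweaks stall in the tens of thousands instead of reaching the top 10, which is the article's point: with shrinking gains, no realistic number of tweaks gets there.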
Aside from the additional number of tweaks, each with a submission and spidering cycle of over three weeks (making this a much longer process than today, probably doubling the project schedule a year from now), the precision needed to rank a page well in one search engine will almost certainly disqualify that same page from a simultaneous top ranking in many other search engines.
Pages that rank well today in many engines will find that their aggregated ranking erodes, and sites will have to settle for ranking in only three or four engines at a time. If a client wants other engines to rank their site, they must tune additional pages, and thus expand their content and Search Engine Optimization base to include many more pages within their site. Optimizing more pages is certainly the way to go, but it doubles or triples the work involved for the SEO practitioner. As a result, SEO practitioners face a much more difficult battle; they require much more sophisticated, integrated tools, and projects require more time.
They must optimize more pages, and thus they must inherently charge much more than today. If this is done, top rankings are still very possible, but this becomes the realm of only the exceptionally competent (or very lucky) SEO practitioner. Expect significant growth in the number of indexed pages, and expect a fallout of those SEO practitioners who are no longer viable. You can also expect longer Search Engine Optimization schedules, fewer top rankings per optimized page (necessitating larger projects), and pricing of at least triple what is paid today.
What this does to the entire web industry is scare off those without the funds to participate in a competent SEO effort. What was once thought of as free now has a high cost of entry. And this will kill the web as a golden goose. It will cost much more to make money on the web, just like in a "real business".
The rewards are getting larger, but so is the cost. And as with the Emperor's New Clothes, many SEO practitioners have been reluctant to discuss this transformation for fear of getting hurt (scaring off customers). I, for one, am not afraid to roll up my sleeves and do the hard work for the just rewards. But not everybody is as capable. Many in the SEO industry are standing there naked, and it isn't a pretty picture. But it is time to expose the difficulties associated with SEO.