Does Text-Link-Ads destroy Google? (Is Google F’d?)

One of the best algorithms anyone has come up with for providing good search results is Google’s PageRank.  Simply put, the value of a specific page on a website is related to the value of the pages that link to it.  So, if lots of people link to you, you get higher PageRank.  The key here is an actual, real, HTML anchor-text link.  No Javascript clicks, no banner ads, etc.  Those things don’t count.  The value of a page is related to how many REAL links point to it.  Also important is the value of the pages doing the linking: a page that gets linked to by a highly valued page ranks higher than a page that is linked to by a less valued one.  It’s pretty simple.  If a site is linked to a lot, it’s an authority.  If it’s an authority, and it links to you, well, you’re good too.  It’s fairly common knowledge that PageRank isn’t the ONLY thing Google looks at, but, regardless of what anyone says, it is important.  Google conceived of the system, and they provide pretty good search results.  Everything’s great in the world.  Except… when it’s not.
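To make that concrete, here’s a toy sketch of the kind of iteration PageRank is built on.  This is a simplified power iteration with a damping factor, not Google’s actual code, and the page names in the example graph are made up:

    # Toy PageRank: each page splits its score evenly among the pages it
    # links to, plus a small "random surfer" baseline, iterated until the
    # scores settle.
    def pagerank(links, damping=0.85, iterations=50):
        """links maps each page to the list of pages it links to."""
        n = len(links)
        rank = dict((p, 1.0 / n) for p in links)
        for _ in range(iterations):
            new_rank = dict((p, (1.0 - damping) / n) for p in links)
            for page, outlinks in links.items():
                if not outlinks:
                    continue  # dangling page: its score just leaks away here
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            rank = new_rank
        return rank

    graph = {
        "hub":       ["authority"],  # "authority" gets a real inbound link
        "authority": ["you"],
        "obscure":   ["me"],
        "you":       [],
        "me":        [],
    }
    ranks = pagerank(graph)
    print(ranks["you"] > ranks["me"])  # True

Run it and “you” beats “me”, even though each has exactly one inbound link: the link to “you” comes from a page that is itself linked to.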

Looking back at web history, webmasters figured out the importance of links, and they did what they could to get them.  Link exchanges became common, and they still are.  Link spamming, especially on blogs, became common.  Google had to react, to make sure that their algorithms still worked.  If certain people unscrupulously gather links and perform better in search results, Google ceases to be an independent provider of relevant search results.  If this happens too much, people stop using Google.  So, the point is that people have been trying to game Google, and Google has been reacting.  However, for the most part, Google’s been reacting to things like link spamming, blog spamming, and the like.  But now, things are changing.

Text Link Ads is a service that lets webmasters BUY links.  This is different.  Instead of swapping links or earning them editorially, there’s now a market for buying links on sites with good PageRank.  Webmasters like it because it gives them another stream of revenue; people doing SEO like it because it gives them an easy way to build links.  Text Link Ads says that it’s not really about PageRank.  I disagree.  If it weren’t about PR, then why not build something resembling a banner ad system?  Why not value links based on clicks or impressions?  Because you want real anchor-text links that translate into transferred PageRank, that’s why. (Note: I’m not saying there are no other reasons for buying TLAs, such as traffic.  But if PR weren’t a large part of this equation, why not just have a more traditional ad system?)

Is this bad?  My guess would be that Google thinks so.  Google wants their search results to be good, and anything that potentially hurts that is a “bad” thing.  However, the case can be made that this actually isn’t so bad.  For one, another revenue source brings more money to websites, promoting better content.  Also, TLA has editorial approval of links (and of the sites displaying them), and for the most part, the links actually conform somewhat to the niche they’re displayed in.  Here’s an analogy:  You can pay Yahoo to be in their commercial directory, and Google counts this as OK.  They even suggest it.  The idea is that if you pay Yahoo to list you, and they do list you, that’s a signal you’re good.  You’ve spent the money to prove that you’re not a fly-by-night scam or spam operation, and Yahoo has independently confirmed it.  Google likes this.  Take this to the next level.  Would Google approve of other directories doing the same?  Why not?  I think the same rules apply. Take it another step, to where Text Link Ads is.  They’re still like a directory in that they place links in niches.  There’s still editorial control, most of the links are in-genre, and the real money changing hands is a barrier to uberspam.  It’s just … decentralized.

I think TLA itself isn’t enough to really hurt Google.  However, there are networks out there that take this idea another step towards spamminess.  Here’s what they do:  You give up links on your sites in exchange for links on other sites.  It’s just like TLA, but you get “paid” in links, which you can only “spend” on other links.  These networks generally skim off the top and actually sell some of their inventory.  They’re also fairly sophisticated about how much “currency” you get for displaying links on your site.  Here’s why they’re bad:  There’s much less of a barrier to spam, and much less editorial control.  Once systems like these become automated enough, they’ll degenerate into spam.  On the other hand, maybe they won’t.  Maybe there’s enough money to be made, and enough focus on keeping everything legitimate, that they won’t really hurt Google that much.

However, can something that gives some sites an artificial advantage over others in the SERPs really be anything but bad for Google?  In the end, don’t these sites exist to game Google?  And isn’t anything that games Google bad for Google?

Time will tell whether these networks thrive or whether Google figures out how to do some fancy graph analysis that penalizes network participants, thus killing the networks.

What do you think?  Is Google screwed by such practices? Will they evolve? Is a solution possible?  Or does it require more computing power than even the Googleplex can amass?

Time will tell.


Flock is pretty cool

So I’m trying out Flock, a new web browser based on Firefox.  It’s got support for blogging, photos, del.icio.us, RSS reading, etc.

It seems pretty cool so far.

When I first heard about the project, months ago, I was very dismissive.  I thought they were making a useless tool, that nobody would ever need anything more than Firefox.  Now, after playing with Flock for 30 minutes or so, it seems really damn good.  I see exactly their vision, and I want to see MORE of it.  I know they can do a lot more with this project, and I’m excited about it.

I’m even more excited that the Firefox team will see it and incorporate some of the better features back into Firefox, making IT an even better product.


Low Memory Computing

There is a seemingly unstoppable trend in computing to make ever more memory available to applications. When we run across performance bottlenecks, one of the easiest fixes is usually to “add more RAM.” However, there is also a trend towards getting virtual private servers to host websites. This trend isn’t new: businesses have been consolidating production servers onto virtual servers for a while now. The “new thing” is that more and more people can get their own slice of a real server from hosting companies. I just moved this blog, Urban Pug, and Clean Your Microfiber to a small virtual server from Quantact.com for a very affordable price. We’re at the point where you can split a decent server up 80-100 ways and give everyone decent performance. (Do the arithmetic, though: 8 gigs of RAM split 100 ways is only about 80MB per guest.)

There is a problem, however. If you put 8 gigs of RAM in a system with, say, two high-powered Xeon or Opteron processors, you can split the CPU cycles up and guarantee everyone a minimum amount of performance. This part is straightforward: the host system gives all guests as much CPU as they want, and when there’s contention, a “fair” mechanism ensures even distribution of cycles. RAM is different. You can guarantee each guest a fixed amount of RAM, but you can’t “burst” RAM the way you can CPU cycles. The guest operating system can’t just be told, “hey, you have more RAM now.”
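To make the asymmetry concrete, here’s roughly how it looks in an OpenVZ-style setup. The vzctl flags are real, but the container ID and values are made up for illustration, and I’m not claiming this resembles any particular host’s actual configuration:

    # CPU is a *share*: container 101 can burst to any idle CPU, and is
    # throttled fairly against other containers only under contention.
    vzctl set 101 --cpuunits 1000 --save

    # RAM is a *ceiling*: privvmpages takes barrier:limit in 4KB pages,
    # so this is a hard ~128MB cap. There is no bursting past it.
    vzctl set 101 --privvmpages 32768:32768 --save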

That leaves us with the situation of having pretty damn fast virtual servers that are set up with small amounts of RAM. The problem THIS creates is that standard applications such as MySQL and Apache assume, in their default modern configurations, that a fair amount of RAM is present. Run them untuned on a small guest and you can easily end up out of RAM and swapping to disk a lot. If you’re in that situation, you might actually be better off limiting the applications in some way so they use less RAM (and are thus potentially slower under certain conditions).

It’s not a “win-win” situation. If you need a big fast server, you’ll still have to get a big fast server. If you just need something medium-to-small, it’s possible to do this on the cheap and still get good performance.

So, how do we do this? Basically, it comes down to limiting the RAM MySQL uses and limiting the RAM and number of processes Apache uses (or using other applications altogether, like lighttpd). In my next few posts, I’ll discuss what I’ve learned from tweaking my own VPS, and hopefully get some feedback on how to do a better job.
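In the meantime, to give a flavor of what that tuning looks like, here’s a minimal sketch of the kinds of settings involved. The directive names are real MySQL and Apache (prefork MPM) options, but the values are illustrative guesses for a small guest, not recommendations:

    # my.cnf -- shrink MySQL's appetite
    [mysqld]
    skip-innodb             # if you only use MyISAM tables, drop InnoDB's overhead
    key_buffer = 8M         # MyISAM index cache, the biggest single knob
    max_connections = 30    # per-connection buffers add up fast
    sort_buffer_size = 512K
    table_cache = 32

    # httpd.conf -- cap the number of Apache processes
    StartServers         2
    MinSpareServers      2
    MaxSpareServers      4
    MaxClients           15   # each child can easily be 10-20MB with mod_php
    MaxRequestsPerChild  500  # recycle children to contain slow memory growth

The common theme is trading some peak throughput for a hard ceiling on memory, so the box never falls into heavy swapping.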

Google News to RSS Feed Converter

This afternoon, I was really frustrated. I found an article I liked while looking through some financial sites, and I realized, “hey, I want to SUBSCRIBE to this article.”

Well, I went to Google News, thinking, “of course, they probably already have RSS feeds of news searches, just like Technorati has for blog searches.” I couldn’t find any, so I looked around and found a couple of tools, including RSSgenr8 by XMLHub.com. Then, in an hour or so, I cooked together a little tool to track some news stories via my Bloglines account.
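For the curious, here’s a minimal sketch of the RSS-generating half of such a tool. The scraping half is omitted (Google News’s markup is a moving target), so assume you’ve already extracted (title, link) pairs from the search results page; the function and variable names are mine, not from the actual tool:

    from xml.sax.saxutils import escape

    def make_rss(feed_title, feed_link, items):
        """items is a list of (title, link) tuples."""
        out = ['<?xml version="1.0"?>',
               '<rss version="2.0"><channel>',
               '<title>%s</title>' % escape(feed_title),
               '<link>%s</link>' % escape(feed_link)]
        for title, link in items:
            out.append('<item><title>%s</title><link>%s</link></item>'
                       % (escape(title), escape(link)))
        out.append('</channel></rss>')
        return '\n'.join(out)

    print(make_rss("apple news", "http://news.google.com/news?q=apple",
                   [("Example headline", "http://example.com/story")]))

Serve that output with a text/xml content type from a little CGI script, and any reader like Bloglines can subscribe to it.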

Anyway, if it’s useful to me, it’s probably useful to someone else. If others find it useful, I may make the code look nice and release it under the GPL.

For now, you can try it out: rss-a-tron-o-matic

Edit: Removed link; Google News supports RSS now.

Where will the browser appear as a platform first? — The Enterprise

Here’s a thought from the future:

  • The browser is the platform.
  • Microsoft has embraced this and has released Office as an ASP.NET 2.0 Ajax application for enterprises.
  • Microsoft is continuing its push into ‘Software as a Service’.
  • You can now rent Office XML Application Server for Windows Server 2007.
  • All of your enterprise users, using IE7.0, Firefox 2.1, Opera 16, or Safari, can now access all their office applications from their desktop. (No, IE8.0 still won’t be out, but Firefox will be at 25% market share, and I’m not even going to guess at what Firefox will be called then. How about ‘Burning Rabbit’?)
  • Here’s the catch: when users click on that ‘Microsoft Word’ button (or any of the Office apps), a local application doesn’t load. It loads a rich web application that closely mimics what we now think of as Word.
  • All of the users have their own document storage on your Windows Server.
  • All of the users have access to their documents seamlessly through existing methods (the remote storage automatically shows up in a user’s ‘My Documents’ subfolder, Apple’s Finder/Spotlight, etc.)
  • Users can specify permissions on these centrally stored files, and they are easily shared — people don’t have to navigate to a random person’s desktop to get a document they shared, and a person doesn’t have to email it to them. The documents on the server are all searchable by the user’s local desktop (depending on permissions).
  • When it’s time to upgrade to a new version of Office XML Application Server, the upgrade is done on the server, once, and all clients automatically have their update.
  • I know some of this isn’t new OR likely, but it’s fun to take an old idea that was once pure ‘out there’ thinking and bring it down into the realm of “I see how this is possible even if it’s not either soon or likely”

Also, who knows if it will even be Microsoft who does this? Maybe it’s Sun, maybe this will all play out on Linux desktops first, with “OpenOffice Network Server.” Who knows. I think the day of the browser as a platform IS coming, and I think we’re going to see REAL productivity applications created this way, and I think it’s going to come to the enterprise first.

They’re the ones who can see the real cost savings and increased productivity: ease of deployment and upgrades for the former, and ease of collaboration for the latter.