7 December 2008 at 23:44 UTC,
filed under Hack
The last time I wrote about a hacked site, it was using a redirect that sent some users to a different site. This kind of hack is pretty common (even though it’s usually not as complex as mentioned in that post); it exploits the sad fact that users are often easy to trick and not browsing with protection (or a current browser).
A different angle of attack is to redirect only search engine crawlers to a different site. By doing this, the attackers can make it look like the pages of a website have moved to a new domain name. In general, when search engines find redirects like that, they will more or less pass the “value” that a page had on to the new URL; that generally also applies to PageRank. So in a sense, the attackers are trying to steal the value that a webmaster has built up over time.
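One way to check your own site for this kind of crawler-only redirect (a sketch; the URL and the user-agent strings below are placeholders) is to fetch the same page with two different User-Agent headers and compare the saved copies:

```shell
# Fetch the same page twice, once as a normal browser and once claiming
# to be Googlebot (shown for illustration, not executed here):
#   curl -s -A "Mozilla/5.0 (compatible)" "http://example.com/" > as_browser.html
#   curl -s -A "Googlebot/2.1 (+http://www.google.com/bot.html)" "http://example.com/" > as_googlebot.html

# Compare the two saved copies; on an otherwise static page, a
# difference is a red flag for crawler-only redirects.
compare_copies() {
  if cmp -s "$1" "$2"; then
    echo "copies match"
  else
    echo "copies differ - possible cloaking"
  fi
}
```

Keep in mind that some sites legitimately vary content by user agent (mobile versions, for example), so a difference is a reason to look closer, not proof of a hack.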
In this particular case, a “massive amount” of sites were hacked and likely redirected through suomi.co.in.
Continue reading ‘Hackers stealing your PageRank’ »
5 September 2008 at 22:31 UTC,
filed under CSS
Here’s a simple trick to view nofollow links in Google Chrome. Just drag and drop the following button to your bookmark bar and hit it whenever you want to see links with the rel=nofollow HTML microformat:
This bookmarklet inserts a tiny bit of CSS at the top of the page you’re currently viewing. The CSS is similar to what other nofollow-highlighting methods use:
Continue reading ‘Seeing nofollow links in Google Chrome’ »
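If you’d rather check outside the browser, a grep one-liner can list the nofollow links in a saved copy of a page (a sketch; page.html is a placeholder filename, and the pattern assumes the rel attribute is written with double quotes):

```shell
# A small sample page to run against; in practice you'd save a real
# page first, e.g. with: wget -O page.html http://example.com/
printf '%s\n' '<p><a href="/a" rel="nofollow">sponsored</a> <a href="/b">normal</a></p>' > page.html

# List every anchor tag whose rel attribute contains "nofollow":
grep -Eo '<a[^>]*rel="[^"]*nofollow[^"]*"[^>]*>' page.html
```

This only sees the rel values present in the saved HTML; links added or changed by JavaScript after the page loads won’t show up.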
20 May 2008 at 23:27 UTC,
filed under Tricks
Here’s something from my mailbox: someone wanted to know how he could crawl his site and confirm that all of his pages really have the Google Analytics tracking code on them. WordPress users have it easy; there are plugins that handle it automatically. Sometimes it’s worth asking nicely :) – let me show you how I did it. As a bonus, I’ll also show how you can check the AdSense ID on your pages, if you’re worried that you copied and pasted it incorrectly.
This is pretty much cross-platform, but as a Windows user you’ll have to grab and install two tools first:
- wget – a tool to download copies of web pages
- UnxTools – a collection of popular Unix/Linux tools for the hacker in you
Extract the ZIP files, copy the contents somewhere you can find them, and make sure that the appropriate folders are in your “path” (the files you’ll need for UnxTools are in “…\usr\local\wbin”). We’ll need to access these tools through the command line. I have a feeling I may need to elaborate on that for Windows users :) – let me know if that’s the case.
First, we’ll mirror our site on our local machine (this assumes that your site is crawlable; if it isn’t, then fix it first :D ):
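A sketch of the mirror-and-check steps, assuming the old urchin.js tracking code (example.com is a placeholder domain, and the urchinTracker search string should be adjusted to whatever snippet your pages actually use):

```shell
# 1. Mirror the site into a local folder named after the domain
#    (shown for illustration, not executed here; needs network access):
#      wget --mirror --convert-links --no-parent "http://example.com/"

# 2. List every file under the mirrored folder that does NOT contain
#    the given pattern, e.g. your Analytics call or your AdSense ID:
find_untracked() {
  grep -rL "$2" "$1"
}
```

The same function covers the AdSense bonus check: pass your publisher ID as the pattern instead, e.g. `find_untracked example.com "pub-1234567890"` (a placeholder ID).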
Continue reading ‘Confirm that you’re using Analytics on all pages’ »
1 February 2008 at 23:14 UTC,
filed under Tricks
I’ve been trying to do that for years and did the most exotic things to make it happen. I’ve used four different browsers in parallel, and I’ve even used a virtual PC running within my PC (that kind of defeats the desire to use less memory, but it feels neat anyway). In the end, a colleague in the office, who happens to use emacs as his main web browser :D , pointed me in the right direction.
Now I have three completely independent instances of Firefox running at the same time!
So what’s the trick?
Continue reading ‘Running Firefox in parallel’ »
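For what it’s worth, the usual way to get independent instances (a sketch, and quite possibly the trick the post goes on to describe, though I haven’t confirmed that) is to give each instance its own profile and pass the -no-remote flag, so a new launch doesn’t just hand its URL to the already-running instance:

```shell
# Build the launch command for an isolated Firefox instance. "work" and
# "testing" below are example profile names you'd create beforehand with
# the profile manager (firefox -ProfileManager).
ff_instance() {
  echo "firefox -no-remote -P $1"
}

# Each instance then runs with its own profile (shown, not executed here):
#   $(ff_instance work) &
#   $(ff_instance testing) &
```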