Question of the week:

What’s the best way to do indexation checking for large lists of URLs?

 

Second question of the week:

Are you using any backlink indexation/boosting systems? Are they working for you?

 

Cheers,

 

Please leave a comment below.

Note that I don’t reply to all comments, but I do read them all. If I don’t reply to your comment, it most likely means that I agree with what you say and don’t have anything meaningful to add myself. Your comments play an important part in what I write about, so keep them coming.

Spammy comments are always filtered or removed. No surprise there.

  • This is the first time I’ve heard of backlink indexing. Would like to know what it is and why/if it’s important.

    • Oh, looks like I’ve got a topic for an upcoming blog post. :)

  • Hi Shane,
    Please use a good video sharing service to upload your videos.
    Video buffering speed is very slow.

    Thanks!

    • I don’t know, it played for me in under 3 seconds and the video never dragged throughout. Perhaps the issue is not with the video sharing site. (???)

    • Sorry if the video isn’t working for you. After having my YouTube account shut down out of the blue, I really don’t want to use a sharing site for the videos here anymore.

      I’m still using YouTube and other sharing sites for all of my “secondary” sites, but this here is my main site and I don’t want videos disappearing for no reason (again).

  • I haven’t used any software myself, but I know that SeNuke has a social bookmarking and RSS module that can bookmark large numbers of URLs using different social profiles. But it’s very expensive: about $127/month.

  • I just use IMautomator to link to some of my links. I haven’t yet got a decent tracking system set up to handle details…

    I seem to remember Paul Forcey selling something similar tho’ – be worth contacting him. And Andy Fletcher too…

    :-)

    Alex

  • Well, usually I don’t overtly try to index new websites.

    But I had to yesterday.

    Used the Rapid Indexer add-on in ScrapeBox (ran it twice). It actually works like a pinging thing.

    And what do you know, 90% of the site got indexed today.

    It only took 1 day to index.

    Thing is, I can index as many sites as I want at the click of a button, making ScrapeBox very useful in terms of indexing new sites/pages.
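
    For anyone curious what the “pinging thing” usually is under the hood: most ping services accept the weblogUpdates XML-RPC call. A minimal sketch in Python (the Pingomatic endpoint is just one example of such a service, not necessarily what ScrapeBox itself uses):

    ```python
    import xmlrpc.client

    def build_ping(site_name, url):
        """Build the XML-RPC request body for a weblogUpdates.ping call."""
        return xmlrpc.client.dumps((site_name, url), methodname="weblogUpdates.ping")

    def send_ping(site_name, url, endpoint="http://rpc.pingomatic.com/"):
        """Send the ping to one service; any weblogUpdates-compatible
        endpoint works the same way."""
        server = xmlrpc.client.ServerProxy(endpoint)
        return server.weblogUpdates.ping(site_name, url)
    ```

    Looping `send_ping` over a URL list is all the “click of a button” amounts to.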

  • Hey Shane – It would be great if you could come up with some sort of application that gets masses of URLs indexed in the first place. At the moment I’m using a number of different methods that don’t seem very coherent.

    First, I run the URLs through “SEO Fast Indexer”, which multi-pings them & then adds them to a ‘fast index list’.

    Next I take the URLs & create RSS feeds (100 URLs at a time) & then submit these feeds to RSS Bot (or sometimes I just use online services like feedshark etc.). I’ve also recently started submitting the URLs to the free version of Linklicious.me.

    My problem is that I’m not sure how effective all this is because I don’t have a way to easily test how many of these links actually get indexed.

    If you can come up with some kind of software app / system that gets large lists of URLs indexed, and then checks that indexation, you’d be onto a winner for sure!

    Cheers, Neil

    • Thanks for your detailed input!

      I started a test of linklicious.me as well, but haven’t completed it because of the problem mentioned in the video.

      Sam (SECockpit guy) has been experimenting with a system for getting links indexed. A pretty clever one at that. Checking when the pages get crawled and indexed could be a part of that and maybe, just maybe, it will turn into a system we can offer for others to use at some point. :)
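
      The RSS step Neil describes – turning a flat URL list into feeds of 100 – can be sketched in a few lines of Python. The feed title and chunk size are placeholders:

      ```python
      import xml.etree.ElementTree as ET

      def urls_to_feeds(urls, chunk_size=100, title="Link feed"):
          """Split a URL list into RSS 2.0 feeds of at most chunk_size items."""
          feeds = []
          for start in range(0, len(urls), chunk_size):
              rss = ET.Element("rss", version="2.0")
              channel = ET.SubElement(rss, "channel")
              ET.SubElement(channel, "title").text = f"{title} {start // chunk_size + 1}"
              ET.SubElement(channel, "description").text = "Generated feed"
              ET.SubElement(channel, "link").text = urls[start]
              for url in urls[start:start + chunk_size]:
                  item = ET.SubElement(channel, "item")
                  ET.SubElement(item, "title").text = url
                  ET.SubElement(item, "link").text = url
              feeds.append(ET.tostring(rss, encoding="unicode"))
          return feeds
      ```

      Each returned string can then be hosted or submitted to whatever feed service you use.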

  • The problem with software that checks whether a backlink is indexed is that Google blocks you when you send too many automated queries. Proxy support is necessary.

    • That’s true, without proxies, you can’t get through a long list of URLs.

  • What’s the best way to do indexation checking for large lists of URLs?
    – the “best way”? Wow, I wish I were that smart! I do not know!

    Are you using any backlink indexation/boosting systems? Are they working for you?
    – I use Link Energizer with a set cron job. The whole approach is very simple and easy to use (Web 2.0 cluster sites and self-hosted WordPress using attractive RSS feeds).
    – Yes, I believe Energizer is helping get my backlinks cached and indexed.

  • Hey Shane

    I have used Scrapebox, Linklicious, some online tools, I have had people do some Xrumer blasts and used the Senuke pinger. Probably tried others but they are the ones I can think of right now.

    There are many ways to get the indexing done. Pinging is OK but I would say that getting backlinks to your backlinks would be the most effective.

    I have heard that Backlinks Energizer is one of the best tools out there. Takes a bit of setting up but supposed to give great results. Is that one of the tools you are looking at?

    • Yeah, Backlink Energizer can be a bit of work to set up. But it’s a clever system for sure.
      That’s one of the systems I’d also publish a review on, yes.

      And I agree: backlinks to backlinks is the best solution.

  • Hi Shane

    Getting backlinks indexed has become a bit of a personal quest. OK, I need to get a life!!

    Having been doing SEO for clients for quite a few years using articles, Web 2.0 sites etc. with good success, I decided to find out what was doing what. I was quite surprised to find that a lot of the articles, posts and profiles had never been indexed.

    So I started collecting the URLs and fed them into my own software that checks to see if they are indexed. The software then produces lists of non-indexed URLs that I feed into various backlink boosting methods that I have tried.

    I used to use ScrapeBox for this but found it to be unreliable. I think it uses inurl:http://….. to see if the page is indexed. You actually get a more reliable result if you search using site:http://…..

    Current fav is Backlink Energizer, which uses posts to multiple Web 2.0 cluster sites. This works if you leave it long enough. However, it is interesting that in the tests I have run, only certain of the Web 2.0 sites themselves ever get indexed! So the power is actually coming from 2-3 of the 8 sites the system uses. (This is assuming that these pages have to be indexed themselves, rather than just being spidered to find the link pointing to your backlink.)

    I am convinced that I am missing a big piece of the jigsaw. As certain sites/pages can get indexed by placing a single backlink somewhere. But others, especially profile pages and some article sites take a lot of links, pings etc to get them indexed. Some can take months.

    So keep asking the questions

    Rich

    • Ah…
      See, I didn’t know that Scrapebox uses the inurl command.

      Theoretically, you could do a site: search for all the URLs (using the merge function and the URLs as a keyword list) and then remove the duplicate domains, if necessary.

      I’ll have to test and see if that’s a better solution.

      Thanks for your input, Rich!
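
      For reference, the site:-based check discussed above could be sketched like this in Python. The fetching itself is left abstract, since (as noted elsewhere in the comments) you’d need rotating proxies to run many queries against Google:

      ```python
      def site_query(url):
          """Build a site: query for one URL (the scheme is stripped)."""
          bare = url.split("://", 1)[-1].rstrip("/")
          return f"site:{bare}"

      def is_indexed(url, fetch_results):
          """True if a site: search for the URL returns a matching result.

          fetch_results is whatever search function you plug in (ideally
          behind rotating proxies); it takes a query string and should
          return a list of result URLs."""
          bare = url.split("://", 1)[-1].rstrip("/")
          return any(bare in result for result in fetch_results(site_query(url)))
      ```

      Splitting the URL list into batches and throttling `fetch_results` is what keeps the queries from tripping Google’s automated-traffic block.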

  • I’m not sure if the cache is replaced/created every time a page is changed/added to the main index. If it is, you might be able to use the footprint cache:[URL] and pull the first date from the returned page (proxies + SB + programming required).

    Best of luck – hope someone knows of a tool out there, I’d be really interested to see the results of your tests Shane!

    Justin

    • Thanks for the suggestion! I’m pretty sure the cache isn’t always updated, but it could still be very interesting to know how many of the backlink pages have been cached. It’s also a measure of how effective a link indexation tool is.
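
      If you did want to pull the cache date Justin suggests, the extraction step might look like this in Python. The wording of Google’s cache banner is an assumption here and changes over time, so the pattern would need adjusting against real cached pages:

      ```python
      import re

      # The banner phrasing ("as it appeared on 22 Mar 2011 11:41:02 GMT")
      # is an assumed example, not a guaranteed format.
      CACHE_DATE_RE = re.compile(r"as it appeared on (\d{1,2} \w{3} \d{4} [\d:]+ GMT)")

      def cache_date(html):
          """Return the snapshot date from a cached page's banner, or None."""
          match = CACHE_DATE_RE.search(html)
          return match.group(1) if match else None
      ```

      Fetching the cache:[URL] pages themselves would still need the proxies mentioned above.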

  • I have the opposite problem to you, Shane:
    I can write software in my sleep, but have no idea how to market it.

  • This indexation problem came to my attention just recently. Like Rich, I had no idea that indexing was a problem. After I discovered the problem, I thought of setting up Google alerts. But it’s a vague method. I need more solid proof.

    I will keep my eyes open. I hope to get information soon.

  • Hey Shane,

    I use ScrapeBox and Andy Black’s tools for checking. Problem is, put the same URLs into each system for checking and you get different results back. I emailed the owner of Linklicious and asked about using it to get profile links indexed, and he very honestly said it probably wouldn’t work.

    I am using LJM and NLI. Of the 2, the support for NLB is much better and it is hands-off, as the system itself tells you whether the link is indexed… and cached (which I feel is important). You can even click on a link manually and it will take you straight to the cached page, and you can verify with your own eyes.

    Prob for me with NLB is that it can be slow to index links. KK says this is because of the type of links that are being put through the system and gives advice on getting the rate of indexation up. He also gives a guarantee of at least 50% indexation.

    For me personally, the jury is still out on LJM and I doubt if I will continue with it. My experience of support was less than brilliant. Also having to mess around in the interface with adding the urls a second time for checking indexation, (checking is done using Site:URL or Info:URL) plus having to use your own proxy list to do so, was not something I had expected for a service that costs what it does. Not a hands off solution by any means.

    The results of using the LJM index checker as with Scrapebox & Andy’s tools, produces different results at different times. With NLB you enter your project once and it does the indexation and checking for you. You can also very quickly click on a link for manual verification (takes you straight to the cached page) all from one place, and without having to scrape or pay for your own proxy list.

    The NLI interface has been cleaned up but could be further improved.

    Getting as many of the backlinks that we spend a lot of time/money building, indexed/cached is very important to get the maximum benefits from the time/money we invest in building them. But, if the process of getting that done and checking the results is time intensive, and the indexation results vary depending on how they are checked, then the system benefits are not optimal.

    I need something that has a fairly quick project setup and does its own checking using its own proxies/resources and gives the option of quickly manually checking, should you require it. For me (so far) NLI comes closest to a hands off system that produces results.

  • Oops… Where/when I referenced “NLB” in the above post, I meant “NLI,” Nuclear Link Indexer. I’m not an affiliate, although I should be?

  • Well, indexing is like the Holy Grail to some. :) I’ve used them all. I use Linklicious (basic) and if you listen to those guys, they believe just getting links crawled is pretty good. I also have BL Energizer but the cron jobs have been giving me headaches lately. Usually I use SB just to check PR and put the PR3-and-above links into the Energizer bunny. You can also email links through Gmail and the bot will read the mail, which helps some. Manually posting links to a Blogger blog and pinging those will help if you have the time… and of course backlinking the backlinks with Xrumer blasts – and then you have to try to get those indexed…

    UGH about that software solution…I will be a beta tester for anything you can dream up.

  • As far as backlinking my backlinks goes, I’m gonna start a test using this service in just a little while http://www.backlinkbooster.com/ (not affil). Anybody tried this yet? Looks like it will also track the backlinks in the app and keep working them until they DO get indexed.

    As far as tracking backlinks goes, the game has permanently changed. Below is a post I made on the A100K forum about this. FYI, Traffic Travis is coming out with v4 that will use Majestic SEO data. And Market Samurai is already using MSEO data.
    ————–
    Has our ability to get accurate counts for backlinks gone the way of the dinosaur?

    If you go read this thread by Darren Warmuth of UAW http://www.uniquearticlewizard.com/blog/backlink-tracking/backlink-data-%E2%80%93-a-must-read%E2%80%A6.html,

    he talks about the Bing/Yahoo merger (Bing is the dominant one) and how this merger has permanently affected our ability to get an accurate count of backlinks for our or a competitor’s page(s), in that Yahoo Site Explorer won’t be maintained any longer because of Bing’s known decision not to give out this data – at least in any accurate way (just like Google has always done).

    And if all tools such as Market Samurai, Micro Niche Finder and Traffic Travis et al (I own them all and prefer TT to do competition research) pull their backlink data from YSE, then they can no longer be counted on for accuracy either, and will get less and less accurate, until YSE is pulled down permanently at the end of this year.

    We can still get accurate data for PR and on-page optimization (Title, Description and H1 tag optimization) but the number of backlinks is a MAJOR consideration for us, which without accurate information will kill any and all attempts at accurate keyword research. And we all know that choosing the right keyword set — that you believe through research you can rank for — is Job #1.

    If all of this is true, then how will we be able to get accurate backlink counts when doing keyword research?

    Has anyone ever used http://www.majesticseo.com/ as suggested in the article? If so, can we configure this in such a way as to conform to A100K research criteria, as in getting the backlink data for Page 1 results all on one screen, without the costs associated with this type of data manipulation being exorbitant?

    Thanking you all in advance for your opinions and insight on this.
    —————–
    Here’s an email, and the response, I got from Traffic Travis about this problem:

    At the moment, I’m a free user of TT and I must say that the SEO Analysis tab is my new favorite tool for checking the Top 10 SERPs for a chosen keyword phrase. It shows me everything I need to see in order to judge SEO competition: number of backlinks, PR, and meta tag optimization.

    My problem is this: since the Bing/Yahoo merger, and the news that Bing will no longer be maintaining the backlink database for Yahoo Site Explorer (just like Google has always done; for whatever reasons they don’t show complete backlink data) and this functionality in YSE has already started to drastically deteriorate, how will TT accurately pull backlink data now? (I’m assuming that TT uses YSE’s API to pull this data.)

    Are you planning any upgrades, like, possibly a module where we could plugin our account details from, say, Majestic SEO, and then TT would pull and format the backlink data into TT like it always has?

    Please let me know your thoughts on this, as I really feel that all SEO tools that utilize YSE data have now become broken and/or highly inaccurate.

    Thank you for your time.

    Doug Dearing
    ———————
    Hi Doug,

    Thanks for patiently waiting.

    I have heard from our technical team and they said:

    “Traffic Travis version 3 currently uses Yahoo for its backlinks and that won’t change anytime soon. However we are really excited to say that Traffic Travis version 4 will be using Majestic SEO for getting backlinks! We have reached an agreement with Majestic SEO so we are able to provide backlink figures and stats for websites, although the backlink URLs themselves won’t come from Majestic but likely from SeoMoz.”

    We hope to release Traffic Travis version 4 within 8 weeks. It would have been released sooner but this was pushed back because of the recent events in Christchurch.

    I hope this helps.

    Kind Regards,

    Fara
    Customer Support

    • Hi Doug.

      Some good information there.

      I tried to use backlinkbooster.com about 9 months ago.

      I found the interface to not be that user friendly and the RSS registration process was outdated and not very efficient.

      I know they have done some updates earlier this year, so it might be time for me to revisit backlinkbooster and see if it’s more usable and more importantly – effective.

  • Hey Doug,

    Not sure if you have seen Shane & Sam’s SEOCockpit. If you haven’t, it’s well worth a look… It takes its data from SEOmoz.

    http://www.seocockpit.com.

    I’m not an affiliate but I am a user.

    Cheers,

    Hugh.

  • Dear Shane:
    Tuesday, March 22nd, 2011 at 11:41 hours Yokohama, Japan

    Re: Guru Popcorn Vids + SE Cockpit Bootcamp #2 (where do I find it?) + how to get my Gravatar to show up here?

    Just wondering if it would be possible to show some of the Guru Popcorn videos that don’t show up on im->impact.com because of the YouTube decision?

    Do you have any watch-outs for us so to avoid being banned?

    Can you appeal the decision? If you like, we can start a petition!

    Best,

    PV

  • We have to remember that Google uses many indexes. I’m beginning to suspect that looking at the cache is probably the best way for you Shane.

    Quote from http://googleblog.blogspot.com/2010/06/our-new-search-index-caffeine.html

    ‘Our old index had several layers, some of which were refreshed at a faster rate than others; the main layer would update every couple of weeks. To refresh a layer of the old index, we would analyze the entire web, which meant there was a significant delay between when we found a page and made it available to you.

    With Caffeine, we analyze the web in small portions and update our search index on a continuous basis, globally. As we find new pages, or new information on existing pages, we can add these straight to the index. That means you can find fresher information than ever before—no matter when or where it was published.’

    @Doug – I can also vouch for SECockpit – you should take a look. It’s the best keyword research tool I’ve used.
