256 Kilobytes

Find infinite Expired Web 2.0's with 0 skill

Articles in Search Engine Optimization | By Hash Brown

Published Tue, 11 Jun 2019 09:20:08 -0700 | Last updated Tue, 11 Jun 2019 09:20:56 -0700

Do you want a PBN without the expense, domains, difficulty and other stuff you can't be bothered with? Try this.


Did you read this article and think "fuck I want that, but I don't have $1000's to throw at a PBN"?

Do you want to rank on Google with little to no money?

Do you have a copy of Scrapebox?

Great, then today I will show you some simple steps to finding INFINITE expired WEB 2.0 LINKS!

Web 2.0 was when the internet moved from shitty static HTML websites with spinning cat gifs, to a more dynamic arena with user generated content.

Websites like Angelfire and GeoCities allowed users to create their own parts of the internet on subdomains. These little websites were used by people for all sorts of reasons.

Eventually these platforms were replaced by Tumblr, Overblog and Weebly. Lots of these platforms exist, and they are all good for our needs.

What does "expired" mean?

On many platforms, Tumblr included, if the owner of a particular username does not log in to their account for one year, the account is deleted and the username that was once registered becomes available again.

This leaves the possibility that high IQ SEO's will come along, re-register that name, put their own content and links on that URL, and enjoy the free, easy to build, high authority links.

What are good platforms to scrape?

There are literally a whole bunch: Tumblr, Overblog, Weebly and the other platforms mentioned above all work.

Sticking to the well known platforms will help later on when getting things indexed, and there is a bigger pool of usernames to pick from. But it also means that every other fucker trying to get these domains will be going after them too. It's good to get a mix and to regularly try scraping different platforms, both to stay ahead and to diversify your links.

Why do expired web 2.0's work better for SEO?

Because they have links.

What makes a good expired web 2.0?

Links.

With Web 2.0's I've always pushed my luck a little more than with a PBN domain (read that guide). I'm not sure why... I tend to worry less about relevancy and focus more on raw authority and links, but I still avoid the obviously spammed profiles.

These links are less effective than PBN links, but I also think they are less risky and offer a better ROI (they are literally free, except for a little content and time). You don't have to worry about hosting or anything technical. Just use a VPN when registering these accounts and you're away into the land of high rankings.

Tutorial

We need two things:

  • Scrapebox
  • Ahrefs

Scrapebox finds the web 2.0's for us; Ahrefs tells us if they are any good.

Step 1: Boot up Scrapebox, get some keywords

We need some keywords to seed Scrapebox with. For this tutorial it's not all that critical that we get the "best" keywords, but one quick way to do this is to go into Ahrefs, export all the keywords your website ranks for, and import these into Scrapebox.

So, go and do that. Export the organic keywords your website currently ranks for. If you have fewer than 50,000 keywords, go and export your competitors' too. The more keywords you have the better; scale is important for this tutorial because, as you will see later, we have to sort through a lot of sand to find the gold.

You will have to copy the keyword column and save it into a text file to import into Scrapebox. If you can't do this, you are not capable of the rest of this tutorial.
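If you'd rather script the copy-paste, here's a minimal Python sketch. The file name and the "Keyword" column header are assumptions about your particular Ahrefs export (Ahrefs offers UTF-8 and UTF-16 exports; this assumes UTF-8), so adjust them to match:

import csv

# Pull the "Keyword" column out of an Ahrefs organic keywords export.
# File name and column header are assumed - adjust to your actual export.
keywords = set()
with open("ahrefs-organic-keywords.csv", newline="", encoding="utf-8-sig") as f:
    for row in csv.DictReader(f):
        kw = row.get("Keyword", "").strip()
        if kw:
            keywords.add(kw)  # a set dedupes overlapping exports for free

# One keyword per line, ready for Scrapebox's import from file.
with open("keywords.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(sorted(keywords)))

print(f"Saved {len(keywords)} unique keywords")

Dumping everything into one set also means your competitors' exports merge cleanly into the same keywords.txt.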

Click "import" and import from file, this is inside the harvester section.

For the sake of the tutorial I am being lazy; there are 6,000 keywords there. I don't expect to find many results with so few keywords.

Also, save your keyword list. There's no need to keep repeating this step, as you can reuse the same keyword list on different platforms.

Note: Scrapebox has a keyword harvester. You can absolutely use it, but literally everyone does, so you will just get the same keywords with the same results. There are also "keyword lists" that offer already-exported text files... This is horse shit. Do it properly.

Step 2: Harvest Some URLs

Scrapebox gives us the option to use custom footprints, and we want to use this.

The "site:weebly.com" footprint will bring back results from only that domain. Swap weebly.com for whatever platform you want to scrape first.

Next up we are going to click on harvest, and you will see the harvester screen. Before you press start, click on Proxies and make sure "Use server proxies" is ticked.

Without proxies you will be blocked from scraping very quickly. Server proxies are public proxies that the Scrapebox server finds for us; they are often shared between everyone. If you already have proxies that will work, go ahead and use those; you will probably have better results.

The final step before we press start is to pick our search engines on the left hand side of this screen.

As you can see I have chosen:

  • Bing
  • Yahoo
  • MyWebSearch

> OMFG what a noob these are terrible Y no big G? AKA Google?

Google is hard to scrape. They are very good at detecting robots and it's not worth the effort. These search engines are way easier.

Now press start.

Step 3: Wait 12-24 hours

> WTF R U SRS? I ND LINKZ NW!!!

Yes I am serious. It takes a long time.

Did you know that 24 hours is 1,440 minutes?

In seconds, that is 86,400.

Step 4: Let's see how many we have...

Once your scraping is complete, hopefully you have millions of URLs harvested. 

I have 250,000 from about 5 hours of scraping. I have to stress, do this properly. Wait 24 hours minimum. This is the kind of task you can just set up on a VPS, check in on once a week, and let run again. For a $15/month server it's worth every penny.

Once your scraping is done press exit; you don't need to export or save anything. Scrapebox will put the URLs into the harvested section for you.

From here we want to trim all of our domains to root, so click Trim > Trim To Root.

And then Remove > Remove Duplicate URLs

This took me from 250,000 URLs down to 50,000 URLs... Now you see why we need millions to do this properly.
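For reference, trim-to-root plus dedupe is simple enough to reproduce outside Scrapebox if you ever need to. A sketch, assuming your harvested URLs include the http:// or https:// scheme and sit one per line in a text file:

from urllib.parse import urlparse

# Trim every harvested URL to its root (scheme + host) and dedupe.
seen = set()
with open("harvested.txt", encoding="utf-8") as f, \
     open("roots.txt", "w", encoding="utf-8") as out:
    for line in f:
        url = line.strip()
        if not url:
            continue
        parts = urlparse(url)
        if not parts.netloc:
            continue  # skip lines without a scheme/host
        root = f"{parts.scheme}://{parts.netloc.lower()}/"
        if root not in seen:
            seen.add(root)
            out.write(root + "\n")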

Next we need a tool to check if these URLs are available for us to register.

In the top menu click Addons > Show Available Addons and install "Scrapebox Alive Checker".

Go back to Addons and click on the now available "Scrapebox Alive Checker", it will open a new window for you.

In the bottom left you can import URLs from the Harvester and press Start.

When this finishes you should now have a list of URLs from your chosen Web 2.0 platform that:

  • Are available to register
  • Have links somewhere on the web (or Scrapebox would not have found them)
  • Hopefully have some sort of history related to your niche, as we found them using keywords you already rank for.
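If you're curious what the Alive Checker is doing under the hood, conceptually it is just an HTTP status check: on most of these platforms a dead subdomain returns a 404. The exact behaviour varies by platform (some redirect dead subdomains instead), so treat this sketch as an approximation rather than a drop-in replacement, and spot check a sample by hand:

import concurrent.futures
import urllib.error
import urllib.request

def is_alive(url):
    # Any response below 400 counts as alive; 404 and friends count as dead.
    req = urllib.request.Request(url, method="HEAD",
                                 headers={"User-Agent": "Mozilla/5.0"})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status < 400
    except urllib.error.HTTPError:
        return False
    except Exception:
        return False  # timeouts and DNS failures counted as dead too

with open("roots.txt", encoding="utf-8") as f:
    urls = [line.strip() for line in f if line.strip()]

with concurrent.futures.ThreadPoolExecutor(max_workers=20) as pool:
    results = list(pool.map(is_alive, urls))

with open("not-alive.txt", "w", encoding="utf-8") as out:
    for url, alive in zip(urls, results):
        if not alive:
            out.write(url + "\n")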

Step 5: Export and run through Ahrefs

This is pretty self explanatory: we are going to take the usernames that Scrapebox marked as dead, run them through Ahrefs, and export again so that we know which Web 2.0's have the most authority.

In the bottom right of the "Alive Checker" you will see the export button; export the NOT ALIVE names and save the file somewhere.

Open up this text file. From the 250,000 original URLs I had scraped, I am down to 1,000 remaining.

Head to Ahrefs' batch analysis tool. This will let us pass in 200 URLs at a time and get the information we want, such as links and referring domains (RDs).

Select prefix in the target mode and press go.

When the analysis is done, export the report and save it to a folder on your C: drive called "merge". Repeat until you have run all your URLs through Ahrefs.
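Since batch analysis takes 200 targets per run, it helps to pre-split your list so you can paste each chunk in without counting lines. A small sketch with assumed file names:

# Split not-alive.txt into numbered files of 200 URLs each,
# one file per Ahrefs batch analysis run.
BATCH = 200

with open("not-alive.txt", encoding="utf-8") as f:
    urls = [line.strip() for line in f if line.strip()]

for i in range(0, len(urls), BATCH):
    with open(f"batch-{i // BATCH + 1:03d}.txt", "w", encoding="utf-8") as out:
        out.write("\n".join(urls[i:i + BATCH]))

print(f"{len(urls)} URLs across {(len(urls) + BATCH - 1) // BATCH} batches")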

Step 6: Combine and sort

If you're a Windows master race member, open up Command Prompt; this is a very easy way to combine our CSV reports into one file.

If you have never used this before, it's going to be fine.

First navigate to the root of your C: drive. Do this by simply typing:

cd ../../

cd = change directory; it's a pretty standard command on many platforms.

Now use the same thing again to enter our merge folder:

cd merge

And finally, the copy command. This matches all files in the folder that end in ".csv" and creates a new file with everything combined, called "merged.csv":

copy *.csv  merged.csv

If done correctly, copy will list each CSV file it merged and finish with "1 file(s) copied".
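Not on Windows, or worried about re-running copy with merged.csv already sitting in the folder? The same merge as a hedged Python sketch; unlike plain copy, it keeps only one header row, which saves cleanup in Excel:

import csv
import glob

# Concatenate every CSV in the current folder into merged.csv,
# keeping the header row from the first file only.
header_written = False
with open("merged.csv", "w", newline="", encoding="utf-8") as out:
    writer = csv.writer(out)
    for path in sorted(glob.glob("*.csv")):
        if path == "merged.csv":
            continue  # don't ingest our own output on a re-run
        with open(path, newline="", encoding="utf-8-sig") as f:
            rows = list(csv.reader(f))
        if not rows:
            continue
        if not header_written:
            writer.writerow(rows[0])
            header_written = True
        writer.writerows(rows[1:])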

Once this is done, open the new "merged.csv" file in Excel or Google Sheets.

On the sheet there will be a lot of rubbish we don't need; get rid of all columns except for:

  • Target
  • Domain Rating
  • Domains
  • Total Backlinks
  • Total Keywords
  • Total Traffic

It should go from the messy raw export to a clean table with just those six columns (with table mode turned on if you're in Excel).

Now, what you pick depends on you; it's good to get a solid mix.

I tend to focus on referring domains and links. I look for high numbers of both in a close 1:1 ratio. If you see a URL with 1 referring domain and 40 links, it means one website has linked to this URL 40 times. It is probably a low quality sidebar link or a homepage link on a forum. That's not good.

It's also worth taking a look at keywords and traffic. Some of these usernames could have only just expired, so picking them up while they are still in the search results is fantastic. Google won't even know they changed hands...

I also tend to ignore anything with a Domain Rating of less than 10. This metric has flaws (they all do) but it's a quick way of looking at how good links are without having to look at every link.
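Those filters are easy enough to apply by eye in a spreadsheet, but if your list is long, here is the same logic as a Python sketch. The column names come from the list above; the 5:1 links-to-referring-domains cutoff is my own arbitrary stand-in for "close to 1:1", so tune it to taste:

import csv

# Filter merged.csv: DR >= 10 and a sane links-to-referring-domains ratio.
picks = []
with open("merged.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        try:
            dr = float(row["Domain Rating"])
            rds = int(row["Domains"].replace(",", ""))            # referring domains
            links = int(row["Total Backlinks"].replace(",", ""))
        except (KeyError, ValueError):
            continue  # skip rows with missing or garbled metrics
        if dr < 10 or rds == 0:
            continue  # ignore weak domains, per the DR >= 10 rule
        if links / rds > 5:
            continue  # e.g. 40 links from 1 domain = sitewide spam
        picks.append((dr, rds, links, row["Target"]))

# Strongest first.
for dr, rds, links, target in sorted(picks, reverse=True):
    print(f"DR {dr:>5}  RD {rds:>5}  Links {links:>6}  {target}")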

This leaves me with 6 web 2.0's from 250,000 original URLs, but they are super strong!

You're not really going to go wrong with these links. They are super fucking easy to build out and they are free (except for time). Go and get them all or someone else will.

Step 7: Register your accounts

Registering accounts is fairly straightforward, but it will depend on the platform you're using. The general idea is:

  • Use multiple emails from a free provider like Hotmail/Yahoo/Gmail. You can buy these for pennies to save time.
  • Use a VPN/proxy on each registration; registering multiple accounts on the same day from the same place is a footprint.
  • Train VAs to do this, because fuck doing it yourself; it's very boring.

Organise your login information in a spreadsheet somewhere and make sure you log in every few months and update something. You don't want to find these available usernames, register them, set up content and links, and then lose it all because you didn't log in or something retarded.
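The spreadsheet can be as dumb as a CSV file. Here is a sketch that flags accounts you haven't touched in a while; the file layout and the 60-day threshold are my own invention:

import csv
from datetime import date, timedelta

# accounts.csv assumed layout: platform,username,password,last_login
# where last_login is an ISO date like 2019-06-11.
STALE = timedelta(days=60)

with open("accounts.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        last = date.fromisoformat(row["last_login"])
        if date.today() - last > STALE:
            print(f"Log in soon: {row['username']} on {row['platform']}")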

Conclusion

This is a low-cost but powerful way of building links for all sorts of niches, though these are B tier links for a reason. It's easy, requires very minimal effort, and gives you more web properties that you may find uses for later on at no cost. You also don't have to worry about bullshit like hosting and other nonsense.

You can do an awful lot with these links. Couple them with something like Quora and guest posts and you have a solid SEO strategy for most beginners.


tnks foR ur tutori,al

its been 18yr sins i lost my sadi
12yr since lost bhw mascot dog
exmod proxyfire 1891-1892

proxie suplier to xxdevils community 1944-1945
admIn of proxiecentral 1815 during batle of Waterloo
Posted by Scuffed Dog at 11 June, 2019 11:23 AM PDT

Where did you lose your sadi?

This is terrible

Posted by Hash Brown at 11 June, 2019 1:02 PM PDT

    "THAT DOG IS GETTING RAPED" - Terry A. Davis


Also, here's the version of this article for Normans:

Posted by August R. Garcia at 11 June, 2019 5:08 PM PDT

Sir, I can do you a nice SEO.
