Top 8 Best AdSense Alternatives

So you’re doing well for yourself and making decent earnings off AdSense. Then one day you get that email in your inbox with the subject “AdSense Account Disabled” and everything changes. You quickly discover that most other respectable ad networks either have low CPM rates or are outright scams. So here I’ve compiled a list of all the other advertising networks I tried before I got my AdSense account back.

8. Amazon Associates

While I am aware that Amazon Associates is technically an affiliate network, I wanted to include it on this list simply because it’s a great source of extra revenue if you’ve got a site with a heavy, loyal following or a high conversion rate. It pays out like any other affiliate program (based on sales), but since Amazon has such a large variety of products, it’s easier to find relevant ones than with other affiliate sites.

7. Chitika

Chitika pays via PayPal and, like AdSense, allows you to customize your ads to better suit your website. The downside to Chitika is that it’s search-driven, meaning the ads shown are based on the search terms visitors used to reach your site, not the keywords on your site.

6. InfoLinks

This ad network automatically turns words in your content into blue underlined hyperlinks, and only shows ads when a user hovers over them. Currently, InfoLinks doesn’t support any type of banner ads; however, they do have lower requirements, so if your site is just starting up it might be worth a try.

5. ProjectWonderful

Despite its rather strange name, ProjectWonderful has a unique model based on advertisers renting time on your site. The best part is that it works on a bidding system, so you usually end up making more money than you would with CPM or PPC ads.

4. BuySellAds

Similar to ProjectWonderful, BuySellAds lets advertisers pay for ad space on your websites based on how much they are willing to pay or on the minimum advertising cost you set.

3. AOL’s advertising solution (yes, AOL) has become a very large and respectable ad network online, although its requirements are rather high compared to some of the others listed above.

2. Tribal Fusion

This ad network is by far the king of the CPM networks and requires about 500,000 impressions per month. Although the requirement is high, it pays off well, with some of the highest CPM rates on the internet, even compared to AdSense.

1. Part of the Yahoo | Bing Ad Network, this is basically Microsoft’s equivalent of AdSense. While the exact requirements are unknown, it’s very difficult to get invited into this ad network; so far we’ve only seen sites with millions of monthly users using it.

Bad Eggs

During my time looking for Google AdSense alternatives, I also found a few spammy or scam advertising networks that look legit but will end up not paying you, or reporting fewer clicks/impressions than your ads actually get.


RevenueHits

While this site would convince anyone that it’s a legit ad network, I found that all of their ads were spammy download buttons that sometimes opened popups or tried to download malicious software to the visitor’s computer. I have also heard stories of people getting their entire sites blacklisted by Google as malicious because of RevenueHits ads.


Adhexa

While Adhexa will indeed pay you and has better quality ads than RevenueHits, they often seem to count only a fraction of the impressions you’re actually getting. This may simply be a software glitch on their servers, but stay away until they become more reputable.


DuckDuckGo VS Google

If you haven’t been following it, you might be surprised that a young start-up by the name of ‘DuckDuckGo’ is challenging the mighty search giant Google. The new privacy-focused search engine was incorporated on February 19, 2008. The main difference between DuckDuckGo and Google is that DuckDuckGo claims not to track your searches and not to customize search results based on your previous searches. DuckDuckGo also offers many features that Google does not, such as a clean modern interface, automatically searching a specific site, and more. However, it should be noted that most of the features DuckDuckGo has that Google doesn’t aren’t really all that useful in your everyday searching. DuckDuckGo also does not rely heavily on its own web crawler, but instead uses APIs from other search engines like Bing and Yahoo, and smaller specialty sources such as SoundCloud.

While all this sounds great, Google still has the advantage over DuckDuckGo when it comes to actually finding what you want. Because DuckDuckGo won’t track you and won’t personalize your results, you also won’t get results as relevant as you would with Google. On top of that, even with its many APIs and search sources, DuckDuckGo has a much smaller index than Google, so your searches will turn up fewer relevant results simply because there is less diversity. Overall it comes down to privacy vs. better results: DuckDuckGo offers privacy, but Google offers better results.


Understanding Robots.txt

Almost every site on the web has a robots.txt file, located at the root of the domain (e.g. example.com/robots.txt), which tells web crawlers or bots what they are and are not allowed to visit. This can be helpful for blocking off admin areas or backend files that you do not want crawled, or for giving bots information like the URL(s) of your sitemap(s).

So what actually is a robots.txt? Well, it’s just a simple text file, made in Notepad or any other text editor, containing a few directives that web crawlers understand. The following are some examples of common rules you might see in a robots.txt file.

# Basic robots.txt allowing bots to visit your whole site
User-agent: *
Disallow:

# Robots file telling bots not to crawl any part of the site
User-agent: *
Disallow: /

# Robots file telling bots not to crawl any files in the admin directory (backend)
User-agent: *
Disallow: /admin/

You can also use directives such as Sitemap to tell bots where to find your XML sitemap or RSS feeds, which can increase crawl rates.
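A minimal sketch of what that can look like (the sitemap and feed URLs below are placeholders, not from this site):

```
User-agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml
Sitemap: https://example.com/feed.rss
```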

There are a few drawbacks to using a robots.txt file, though, the main one being that it is public to anyone who wants to read it. If you are trying to hide an exposed backend directory or an admin area with the Disallow rule, anybody can view your robots.txt, and it does not stop regular users from visiting those pages. Another issue is that not all bots obey robots.txt; these are mainly malicious bots trying to spam comments or upload malware to your web server, and they may even use your robots file against you by visiting your disallowed links in an attempt to crack your admin accounts.

In most cases the benefits of having an up-to-date robots.txt file outweigh the drawbacks, because of the increased search engine crawling from spiders like GoogleBot or BingBot. It also helps to tell smaller search engines that might not have a “webmaster tools” section, like DuckDuckGo or AskClash, where your sitemaps are located.

Overall it’s up to you how much time you want to spend designing your robots.txt file and how you want to deal with the possible threats of using one.


DoFollow VS NoFollow

I feel there is a lot of confusion as to exactly what DoFollow and NoFollow links do and don’t do. A common misconception is that if a link is marked NoFollow then search engines won’t follow it at all. This is not true, because web crawlers simply aren’t going to pass up a potentially content-rich page just because the link to it is NoFollow. However, that doesn’t mean search engines treat NoFollow links the same as DoFollow ones.

Google Bot visiting a NoFollow Link

What the two types of links really determine is the transfer of PageRank, commonly referred to as “PR”. PageRank is how search engines judge how important a website is by counting the number of backlinks to it. The idea is that if lots of other websites link to your site, your site must have good quality content, so it should be ranked higher in the search results. The more backlinks you have, the higher your PageRank will be.
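To make the idea concrete, here is a toy Python sketch of the PageRank intuition described above: a page’s rank is built from the ranks of the pages linking to it. This is a simplified illustration, not Google’s actual algorithm; the damping factor and the example link graph are my own assumptions.

```python
def page_rank(links, iterations=20, damping=0.85):
    """links: dict mapping each page to the list of pages it links to."""
    pages = set(links)
    for targets in links.values():
        pages.update(targets)
    # Start everyone off with an equal share of rank.
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            if targets:  # each (DoFollow) outgoing link passes rank along
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new_rank[t] += share
        rank = new_rank
    return rank

# 'a' and 'b' both link to 'c', so 'c' ends up ranked highest.
ranks = page_rank({"a": ["c"], "b": ["c"], "c": ["a"]})
print(max(ranks, key=ranks.get))  # prints 'c'
```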

NoFollow links are basically links that search engines don’t count as backlinks. This is common practice on high-PageRank sites, because the higher the PageRank of the site linking to yours, the more it affects your PageRank.

DoFollow tells search engines to count the link as a backlink, which gives the linked-to site a higher PageRank based on the PR of the site linking to it.
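In HTML, the only difference between the two is the rel attribute on the anchor tag (the URL below is just a placeholder):

```html
<!-- DoFollow (the default): counts as a backlink and passes PageRank -->
<a href="https://example.com/">Example site</a>

<!-- NoFollow: crawlers may still visit the page, but no PageRank is passed -->
<a href="https://example.com/" rel="nofollow">Example site</a>
```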

Many sites now use NoFollow links because of automated bots designed to browse the web and post links in an attempt to build backlinks to a certain domain, usually one with questionable content.


Backlinks Explained

What is a backlink? It’s very simple, actually: a backlink is just a DoFollow link pointing to a website. It’s useful to have a collection of backlinks pointing to your website because most major search engines, such as Google, Bing, and Yahoo!, use them to determine the trust level of your site.

The basic concept is that the more links point to your site, the better the quality of its content is assumed to be, and thus the higher it should rank in search engines. Backlinks also transfer PageRank from site to site based on a variety of factors, the major one being the PageRank of the linking site. PageRank is a quantifiable way for search engines to express how much they trust your website, ranging from PR1 to PR9.

If you are linked to by a site with very low PageRank (PR), that is considered a low-quality backlink, because it’s not coming from a reliable source and is treated by web spiders as a spam backlink. High-quality backlinks, on the other hand, give search engines a much better impression of your site because they usually come from PR6 to PR9 websites that are already trusted. These backlinks are also weighted more heavily than low-quality ones because they come from sources known not to spam links to low-quality sites.

There are a few exceptions to the low- and high-quality backlink rules, though. The major one is social networking sites like Facebook, which have high PageRanks but only give low-quality backlinks. The reasoning is pretty simple: search engines give backlinks from these sites less value because they are usually mass-posted and can easily be posted by the website owner. This is not a bad thing, though; if backlinks from social media were worth PR9 status, then a link in a tweet would be worth the same as a mention in a post on a major high-PR site.

The last factor search engines look at is the mixture of high- and low-quality backlinks. If a site has only low-quality backlinks, chances are it has been spamming small blogs and social media sites. But if a site has a mix of high- and low-quality backlinks, it tells Google that your site is linked to by trustworthy sites and shared a lot by its users, and thus it must have great content.


Microsoft’s New “HoloLens”

Today while I was watching YouTube I found an ad for the new Microsoft “HoloLens” and was instantly prompted to click it for more information. After some debate in various forums around the web, I determined that Microsoft has not yet released a prototype of the device. Most of what we see in the ad Microsoft is airing is most likely computer-generated effects mixed with real-world footage. Although I was of course intrigued by this new concept of synchronizing the real world with the virtual one, I cannot say it would all be for the best. Even now, with just smartphones and Google Glass, a large share of car accidents are already blamed on people distracted by technology. If this technology were to become reality, it would be best used while sitting down or working indoors in a non-hazardous environment.

With today’s technology I doubt Microsoft will be able to build something like this for at least another decade or two, but it’s an innovative thought. In its current stage, the HoloLens feels more like something from a science fiction movie than a real-world piece of technology.

We must also consider that Microsoft is currently struggling with software piracy and losing billions of dollars. Could this simply be a last attempt by a struggling mega-corporation to survive the onslaught of piracy and the harsh criticism from the media and public over its latest release, Windows 8?


Greenheart Games’ Solution to Piracy

Just a few days ago I read a startling post by Patrick from Greenheart Games sharing some statistics from their latest release, “Game Dev Tycoon”. In the article, Patrick explains an experiment Greenheart conducted on the day of the game’s PC release: they baited pirates into illegally downloading a purposely leaked version of the game containing a slight change to the story and a tracking code that sent anonymous data back to Greenheart’s servers. After the first day, the results were simply jaw-dropping: 93.6% of all copies of the game in use were cracked, and only 6.4% had actually been purchased. This was especially surprising given the game’s extremely cheap price of only $8 USD on stores like Steam and the Windows Store.

In late 2014 the CEO of Ubisoft gave an interview to GameSpot describing a similar study of Ubisoft games, with remarkably similar results: a grand total of 95% of all Ubisoft games being played were cracked copies rather than purchases from legal stores and marketplaces. The list goes on and on through nearly every major game and software company, including Adobe, EA Games, and many more.

So what does this mean for the software industry? Well, there are two very different stories here when we’re talking about companies trying to stay afloat. Smaller companies of no more than 5-10 members could very easily go bankrupt after their first or second release. Bigger companies with well-established products are a different story, because they have access to major server farms and the latest encryption. While smaller companies cannot afford to make massively multiplayer games, bigger companies have the resources to run their software online, where it cannot easily be pirated.

Here are some examples of locally run vs. online software:

| | Can be pirated (locally run) | Cannot be pirated (run online) |
|---|---|---|
| General software | Microsoft Office | Google Docs |

Because the user must have a valid copy of the game, these online games seem like the perfect solution to the piracy problem, right? Well, as I said earlier, servers are expensive, and small startups can’t possibly afford to produce games of the scale and complexity an online setup requires, which makes it extremely difficult for small game developers to succeed.

In the end, while piracy may be convenient now, it will sooner or later come back to haunt us, when even the major corporations struggle to break even on the costs of producing such technological marvels.

Want to see the secret change in the cracked version of Game Dev Tycoon? Watch this video or read this article.


My Top 5 Pet Peeves

We all have that one little thing, or that one moment when someone does something, that just drives us nuts. I call these pet peeves, and today I would like to share with you my top 5 pet peeves that I have seen repeatedly.

  1. “Console is better than PC”… I have heard this endless times from people who barely know the difference between a monitor and a computer. The fact is that most console games (including next-gen) are toned-down versions of the PC game. And most people who tend to say this have never played anything better than a free flash game.
  2. People confusing MB with Mb. Believe it or not, there is a difference: MB stands for megabyte, while Mb stands for megabit. But because megabytes are far more familiar, most people see that their internet gives them x Mb per second and think: “OMG, I have a download speed of 100 megabytes a second.” Really? So you just downloaded that 30-gigabyte file in 5 minutes? No, just no.
  3. People not setting clocks to the correct time after daylight saving time starts. I can understand if it takes you a few days to get all your clocks set, but if it takes so long that by the time you finally change them it’s time to change them back again, you waited too long. I have known people not to change their clocks for literally months after the time change, and they are always late or early for everything… Gee, I wonder why? *sarcastically*
  4. Macs and iPhones are NOT impervious to viruses! Viruses often stay in circulation far longer on Macs and iPhones than on Windows PCs, precisely because of the idea that those devices can’t get viruses, which is very much untrue. The way I see it, whatever you’re good at, there is someone else who is better (unless you are literally the best hacker in the world, in which case: hi, please don’t hack my accounts).
  5. People who walk on the road when the sidewalk is right there. Do you have a death wish? The other day I saw a guy walking on a six-lane road when there was literally a sidewalk six feet away. Why? Just why would anyone do something so stupid? If you ask me, Darwin’s survival of the fittest still works today, in the sense that people who do things like that are weeded out early in life, and the rest of us who know not to take unnecessary risks just keep on living.
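The megabit/megabyte mix-up in pet peeve #2 is easy to check with a little arithmetic (the file size and speed below are just the numbers from the example):

```python
# 1 byte = 8 bits, so a "100 Mb/s" connection really moves 100 / 8 = 12.5 MB/s.
def mbps_to_mb_per_s(megabits_per_second):
    return megabits_per_second / 8

speed = mbps_to_mb_per_s(100)   # 12.5 MB/s, not 100 MB/s
file_size_mb = 30 * 1024        # a 30 GB file expressed in MB
minutes = file_size_mb / speed / 60
print(f"{speed} MB/s -> {minutes:.0f} minutes to download 30 GB")  # ~41 minutes
```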

Is Piracy a Problem?

Piracy is a huge problem online right now… or is it? Before we can decide whether piracy is a problem, I think we need to understand why 2/3 of all teens and 1/5 of adults have pirated in their lifetimes. So why do people pirate? For the most part, it’s simply the ease of it: you can get hooked up with some torrenting software, go to any of the many pirating sites on the web, and get movies, music, and software for nothing. Yes, it’s really that easy! So how are electronic media companies dealing with this? For music and movies, the scope of what publishers can do is very limited: once someone downloads a song as an mp3 or wav file, there is no encryption or serial to bypass, so that single song can spread across the web, downloaded and re-uploaded continuously. In this scenario, the author and publisher of the music or video are powerless to stop the illegal downloading and forced to watch as millions of sales go down the drain.

On the other hand, the second most pirated industry is software and games. These are much harder to pirate, but still incredibly simple in most cases: games and software are cracked with something called a ‘crack’ or a ‘patch’, which tricks the software’s anti-piracy measures into thinking you’ve got a legit copy and serial.

So what are software companies doing to stop this? One example is Adobe’s Creative Cloud, a subscription-based service meant to let middle-class people afford Adobe products. The thinking behind a subscription platform was that if people could get the software for only $10-20 a month, piracy rates would go down. However, nothing is perfect, and Adobe missed one key weakness: because they let you download the full version of any of their products for a free 30-day trial, pirating it is as simple as swapping the amtlib.dll file with a cracked version. The trial is the full version with a ‘trial’ license attached, and amtlib.dll is the license file that tells the program whether you have a trial or full version. Swapping out the license file with a full-version license tricks the program into thinking you have a full license, even if you downloaded it from Adobe’s own website.

Another innovative anti-piracy idea came from Rockstar North when making the popular title GTA IV: they shipped two different .exe’s. The idea was that when pirates tried to crack the game, they would go after GTA.exe, when in reality the actual game was GTALauncher.exe. GTA.exe included what is now called the ‘drunk cam’, making the game camera shake, and to make the pirated version truly unplayable, all cars in GTA.exe would speed up until they flew off the map without letting the player out. Eventually the pirates figured out how to fully crack the game using GTALauncher.exe, but it was a valiant effort from Rockstar North.

So as you can clearly see, there has so far been no permanent way to keep a piece of software or a game from being torrented, and even the measures that do work don’t last long. But does all this really affect the software industry like they claim it does? It depends on the company: for small developers and startup studios it makes a huge impact and can mean the difference between success and failure, but for giant corporations like Ubisoft and Adobe it, for now at least, simply means a slightly lighter paycheck for the employees.

Now, finally, time to answer the question: is piracy a problem? Yes and no. I think calling piracy a standalone problem is incorrect. Overall, the reason most people pirate is that they simply can’t afford the alternative; with our current economy, which is less than stable at best, many people simply don’t have the income to pay for today’s movies and software, which can sometimes cost up to thousands of dollars.