
Tuesday, March 31, 2009

Funny/Geeky/Weird and Grim pages in Wikipedia



Stumbling through Wikipedia, I've found some pages I like to revisit from time to time. The list kept growing, until I decided to share it. What are your favorite pages on Wikipedia? :)


Real-time statistics

Want to figure out what's hot on Wikipedia right now? This page has it: http://en.wikipedia.org/wiki/Wikipedia:Popular_pages :)

Grim statistics

Pages like these give me the creeps. Watch deaths as they are recorded, almost in real time - http://en.wikipedia.org/wiki/Deaths_in_2010

Users

I find this user page funny: http://en.wikipedia.org/wiki/User:Jesus_Lover

Bots

I love the design of this bot's control panel (by the way, it's also the most active bot on Wikipedia): http://en.wikipedia.org/wiki/User:SmackBot
The big red button is just amazing :)

Wikipedia's "official fun pages"

http://en.wikipedia.org/wiki/Category:Wikipedia_humor
http://en.wikipedia.org/wiki/Wikipedia:Department_of_Fun

Wikipedia's "official weird pages"

http://en.wikipedia.org/wiki/Wikipedia:Unusual_articles

Gambling

Wikipedians gamble by betting on the day something will happen - for example, the 666,666th article: http://en.wikipedia.org/wiki/Wikipedia:666,666th_pool
All Wikipedia pools: http://en.wikipedia.org/wiki/Wikipedia:Pools
http://en.wikipedia.org/wiki/Wikipedia:Last_topic_pool

Wikipedia also has a feeling of self-importance

http://en.wikipedia.org/wiki/Wikipedia:Awareness_statistics
http://en.wikipedia.org/wiki/Wikipedia:Most_referenced_articles
http://en.wikipedia.org/wiki/Category:Wikipedia_charts

As well as Wikipedians

List of Wikipedians by number of edits - http://en.wikipedia.org/wiki/Wikipedia:List_of_Wikipedians_by_number_of_edits

And even their bots :)

List of Wikipedia bots by number of edits - http://en.wikipedia.org/wiki/Wikipedia:List_of_bots_by_number_of_edits

Web-comics

Want to know which webcomics are recognized and cool?
http://en.wikipedia.org/wiki/List_of_self-sufficient_webcomics


Hexakosioihexekontahexaphobia

"Just as many hotels have no thirteenth floor (to the great dismay of several late bungee jumpers), Wikipedia will skip the 666,666th article out of systematically biased prejudice. Or threat of lawsuits from extreme religionists. Radiant 12:50, Jun 10, 2005 (UTC)"

The fear of the number six hundred sixty-six is called hexakosioihexekontahexaphobia, and it has its own page on Wikipedia: http://en.wikipedia.org/wiki/Hexakosioihexekontahexaphobia

Vandalism

Vandalism on Wikipedia happens quite often (hence the many anti-vandal bots). However, it's not always done out of malice. Sometimes it's done just for the lulz :)
Funny Wikipedia vandalism - http://www.imserba.com/forum/archive/index.php/t-94521.html
Funny Wikipedia vandalism about government: http://en.wikipedia.org/w/index.php?title=Government&diff=118335356&oldid=118335061


Extras

While writing this article, I also found some other nice leads I'd like to share.
Meet Fucking, a village in Austria: http://en.wikipedia.org/wiki/Fucking,_Austria/
What makes it even more fun is that this village is only half an hour away from Kissing and Petting :)



You may also like: Best 15 Metal covers to pop songs (with originals and videos)

Tuesday, March 24, 2009

Duplicate title tags, part 2

Hello everyone,

I'm still experimenting with the duplicate title tags / duplicate meta descriptions issue in Blogger.

I've roughly described it in my Tweets:

Eterniel: Just did some changes to my website template. Hopefully they will resolve most of the Duplicate Title Tags issues.

Eterniel: The biggest problem in this issue, of course, is Blogger.

Eterniel: It generates unneeded '?showcomment=' urls, which are then interpreted by Google Webmaster Tools as distinct pages.

Eterniel: Furthermore, Blogger exports these links in the /feeds/comments/default feed.

Eterniel: So you have to throw away the 'all-head-content' includable and replace it with your own.


I'm quite sure this will resolve all the issues. However, it will take quite some time for Google to re-crawl the whole site.
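For reference, the head replacement I tweeted about boils down to two things: a canonical link so the '?showComment=' duplicates collapse back onto the post URL, and a title that differs per page type. A minimal sketch, assuming the standard Blogger layout data tags (`data:blog.url`, `data:blog.pageType`, `data:blog.pageName`) - the rest of the skin and head content that 'all-head-content' used to emit still has to be carried over by hand:

```xml
<!-- instead of: <b:include data='blog' name='all-head-content'/> -->
<!-- collapse ?showComment= duplicates onto the plain post URL -->
<link expr:href='data:blog.url' rel='canonical'/>
<!-- distinct titles for post pages vs. index/archive pages -->
<b:if cond='data:blog.pageType == &quot;item&quot;'>
  <title><data:blog.pageName/> | <data:blog.title/></title>
<b:else/>
  <title><data:blog.pageTitle/></title>
</b:if>
```

This is a sketch of the idea, not a drop-in replacement - test it on a copy of your template first.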

I'm quite impatient, so I'll use one trick. Don't pay attention to the following lines; that's just some overly complicated robot SEO magic:


<!-- **robots -->
<script xmlns='http://www.googlebot.com/1999/gsa-crawler' xmlns:meta='http://www.googlebot.com/1999/gsa-crawler/meta'>
<user-agent name='Mediapartners-Google'>
<page>
<uri>http://cranked.me/2008/04/but-i-figure-if-it-isnt-little-scary.html</uri>
<index>1</index>
<follow>0</follow>
</page>
<page>
<uri>http://cranked.me/2008/04/reason-i-dont-shop-online.html</uri>
<index>1</index>
<follow>0</follow>
</page>
</user-agent>
</script>
<!-- /**robots -->