What is Web 2.0?

I ran across this great video titled “Web 2.0 … The Machine is Us/ing Us” by Michael Wesch, Assistant Professor of Cultural Anthropology at Kansas State University, that presents the concept of Web 2.0 in a very interesting and informative way.


Insecure Protocols, Passwords, Ettercap, and PHP

So I was doing my daily web browsing looking for something cool, when I came across something that for the first time really hit home how completely unsecured the Internet is and how simple it is for people to grab your passwords.

Now, I’ve been on the Internet in one shape or form since 1987, and have watched it migrate from the nearly open educational network it once was to the amazing and scary thing it is today. The problem lies in the fact that a lot of the Internet still uses the same insecure protocols that were popular back in 1987; in particular, POP/IMAP email authentication, along with HTTP Basic authentication and others.
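Pre-SSL POP3 is a good illustration of the problem: the client sends its username and password as plain text, so anyone watching the wire can read them with no cryptography to break. A minimal sketch (the session transcript below is made up for illustration; this is what a sniffer effectively does):

```python
# Sketch: why cleartext POP3 logins are trivial to sniff.
# The transcript below is a made-up example of what crosses the wire
# when a mail client authenticates without SSL/TLS.
captured_session = b"""\
+OK POP3 server ready
USER alice
+OK
PASS hunter2
+OK Logged in.
"""

def extract_credentials(raw: bytes):
    """Pull the USER/PASS lines out of a captured POP3 session."""
    creds = {}
    for line in raw.decode("ascii", errors="replace").splitlines():
        if line.startswith("USER "):
            creds["user"] = line[5:].strip()
        elif line.startswith("PASS "):
            creds["password"] = line[5:].strip()
    return creds

print(extract_credentials(captured_session))
# → {'user': 'alice', 'password': 'hunter2'}
```

The credentials are right there in the capture; nothing needs to be decrypted, only read.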

I’m an IT professional (granted, I’m in management now, but still …), yet I was taken aback by how utterly easy it is to capture username and password information. So you’re thinking no big deal, I use a “secure” website to access my online banking, etc. Well, do you also use a different password for every site you visit?

What really hit home for me was this post about the Wall of Shame on the Irongeek site. The page uses some very simple PHP to display the username/password combinations sniffed out by Ettercap.

Ettercap is a suite for man-in-the-middle attacks on a LAN. It features sniffing of live connections, content filtering on the fly, and many other interesting tricks. It supports active and passive dissection of many protocols (even ciphered ones) and includes many features for network and host analysis.
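The “man in the middle” part typically works by ARP spoofing: the attacker sends forged ARP replies so that victims cache the attacker’s MAC address for the gateway’s IP and route their traffic through the attacker’s machine. Just to show how little is involved, here is a sketch of the 28-byte forged ARP reply itself (this is not Ettercap’s code, the addresses are made up, and it only builds the payload, it doesn’t send anything):

```python
import struct

def forge_arp_reply(attacker_mac, victim_mac, spoofed_ip, victim_ip):
    """Build a forged ARP reply claiming attacker_mac owns spoofed_ip.

    This is the core of an ARP-spoofing MITM: the victim caches the
    attacker's MAC for (say) the gateway's IP and starts sending its
    traffic to the attacker. All addresses here are illustrative.
    """
    return struct.pack(
        "!HHBBH6s4s6s4s",
        0x0001,        # hardware type: Ethernet
        0x0800,        # protocol type: IPv4
        6,             # hardware address length
        4,             # protocol address length
        0x0002,        # opcode: reply
        attacker_mac,  # sender MAC (the lie: the attacker's MAC)
        spoofed_ip,    # sender IP (the IP being impersonated)
        victim_mac,    # target MAC
        victim_ip,     # target IP
    )

pkt = forge_arp_reply(
    attacker_mac=bytes.fromhex("aabbccddeeff"),
    victim_mac=bytes.fromhex("112233445566"),
    spoofed_ip=bytes([192, 168, 1, 1]),   # pretending to be the gateway
    victim_ip=bytes([192, 168, 1, 50]),
)
assert len(pkt) == 28  # a complete ARP payload is just 28 bytes
```

Twenty-eight bytes, no authentication anywhere in the protocol: that’s why a tool like Ettercap can redirect a LAN’s traffic so easily.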

Ettercap is an amazing piece of software with many features. The main site only offers the source code, so I started to compile the Ettercap code on my Linux box. My box was missing one of the required libraries, and since I only wanted a proof of concept anyway, I downloaded a precompiled Windows binary instead. Within about two minutes I had everything up and running and was capturing the usernames and passwords of the traffic crossing my network. Wow, that’s really scary.

Today’s little 30-minute diversion is really going to change the way I use the Internet and how I secure my password information. I’m going to look at SSH/SSL tunneling for all my data whenever I can’t use a secure stream.
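As one example of the idea, SSH local port forwarding can wrap a cleartext POP3 session inside an encrypted tunnel (the hostnames below are placeholders, not real servers):

```shell
# Forward local port 1100 to pop.example.com:110 through an SSH
# connection to shellhost.example.com (hostnames are placeholders).
ssh -N -L 1100:pop.example.com:110 user@shellhost.example.com
```

The mail client then connects to localhost:1100 instead of the POP server, and the cleartext login travels inside the encrypted SSH session between this machine and the shell host, out of reach of a LAN sniffer.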

Yahoo! vs. Google

A simple visualization that compares the ordering of the search results from Yahoo! and Google. You can try your own search strings at Yahoo! vs. Google (via infosthetics).

Updated Layout

Thanks to the server outage, I’ve had plenty of time tonight to upgrade my entire site. I’ll post later about what I did and all of the changes that I made (and the fixes that I had to hack together). There are still a couple of things that I want to do and/or correct. I’ll try to get them all working shortly; in the meantime, if you notice anything not working, just drop me a line.

I’m now running WordPress 2.0.1 and the K2 Beta2 theme (links are at the bottom of the page).

Bad GigaBot! Down Boy, Down

OK, so you’re probably thinking this is a post about some new type of robot that has gone crazy, and in a way you would be right. The recent upgrade (still in progress) to the site requires significantly more CPU cycles. Normally this would not be much of a problem, but I noticed that my box was working extra hard last night and getting quite hot to boot. When I went looking, I did find a robot that was hammering my site.

GigaBot is the web spider for the GigaBlast web search site, and it was not playing very nicely. I don’t know how long it had been hitting my server, but so far this month it had downloaded 450MB of data. After a little bit of research, I updated my robots.txt file to tell GigaBot not to crawl my site. Once it finally picked up that change and stopped hitting my site, the machine returned to normal.
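For reference, the robots.txt rule for this is tiny. A crawler is matched by the name it announces in its User-agent string (here I’m assuming it identifies itself as “Gigabot”):

```
# robots.txt -- ask Gigabot to stay out of the entire site
User-agent: Gigabot
Disallow: /
```

Well-behaved crawlers re-fetch robots.txt periodically, which is why it took a while for the change to take effect.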