Monday, October 26, 2009

10 million servers - where are we heading?

"Google envisions 10 million servers" says Google's Jeff Dean.

As a systems user and a guy with a passion for large, powerful systems, the statement makes me want to know more about the network, the electricity, the storage capacity and, especially, the amount of time it takes to carry out a computation. More importantly, I would be interested in the size of the hardware maintenance team. Isn't that interesting?

The same news would raise a different set of questions for other people (with no computer background).
A person from India would immediately think of the power consumption and the bandwidth required for the data centers.

The actual questions remain (at least to me):
1. Do we need such huge computing power?
2. Where are we finally heading in this 'competition for success, name and power'?
3. What happens to nature, with 10 million servers dissipating so much heat?

Is Google thinking of Technology and its effect on Nature in the long run?
Is 'Technology' making our lives easier or more difficult?

No doubt, this is highly debatable. :)

Update: You can find the presentation here
Thanks Sandeep for the link :)

Nobel Peace Prize

The Nobel peace prize should be awarded to newspapers and TV channels (excluding spiritual and mythology channels, NGC and cartoon channels) that do not report a death, mishap or accident of any type for 24 continuous hours.

And the peace prize authorities can amend the rules without any fear: for sure, no TV channel or newspaper would be eligible :P

What do you say? [;)]

Tuesday, October 13, 2009

Aria2 Project- Update

Aria2 is good. I was able to download 653 MB in 2 hours and 28 minutes.
I have used only one option, the one that specifies the number of connections (threads) to be used.
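
Roughly, the invocation looks like this (I am assuming the option in question is -s/--split, which sets how many connections aria2c opens for a single file; the URL is just a placeholder):

[praneeth@inferno]$ aria2c -s 5 http://example.com/path/to/lecture.iso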

Now should try exploring other options as well.

Aria2 Project- Lightweight multi-protocol download utility

There are many download accelerators available out there for Windows (Free Download Manager, Download Accelerator) that help in managing bulky downloads.

Wget is, no doubt, one of the most widely used utilities, but given the volume of files for my task, it is not suitable.

Recently I have come across the aria2 project. Features:

  • light-weight
  • multi-protocol
  • multi-source
  • can be operated from the command line
  • supports HTTP/HTTPS, FTP, BitTorrent ( :) ) and Metalink
  • built-in XML-RPC interface, so aria2 can be controlled programmatically
  • Runs on Linux, FreeBSD, Mac OS X and Windows.
Hmmm..... that's a decent list of features, enough at least to motivate me to try it out.
Lemme (aka let me (expanded for the crawlers :P)) see how useful this turns out to be.
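
To get a feel for the multi-protocol and multi-source bits, this is the kind of basic usage I intend to start with (all the URLs and file names below are placeholders; passing two mirrors of the same file is what exercises the multi-source download):

[praneeth@inferno]$ aria2c http://mirror1.example.com/big.iso http://mirror2.example.com/big.iso
[praneeth@inferno]$ aria2c lectures.torrent
[praneeth@inferno]$ aria2c lectures.metalink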

BTW: previously I was using aget, a multi-threaded accelerator that supports HTTP downloads. And yippie, it helped me fill my HD with many useful lectures and files, but it was limited to HTTP only. :(
Adding to this, the server guys have blocked the usage of aget. :(

PS: Here is a small list of available informal contractions.
If you are more interested in the linguistic aspects of contractions and their limitations in usage, please read this. I love this article by Browning.
Here is another by Radford.

OK, that's too much of a deviation from the topic, so lemme stop here. ;)

Thursday, October 8, 2009

SVOX

Came to know about a job posting on the Linguist List related to developing parsers etc.

Company: SVOX
Address: www.svox.com
Location: Munich, Germany
Areas: Speech dialog, TTS, speech-related software

How To Write A Scientific Paper

Ahh.... don't misunderstand; I am not here to teach any guidelines that fit the title of the post. ;)

The article is from Improbable Research, research that makes people laugh and then think.

This is the link for the article.

Hmmm... yeah the article had stuff that left me thinking :)
A nice read.

Wednesday, October 7, 2009

Linux renew ip address - Force DHCP

We were struggling to bring the network up on an Ubuntu machine (using DHCP).
We tried using the ifdown and ifup commands, but they were unable to bring the network up.

The whole problem was with the IP: the client was not releasing the current IP.
The solution is to release the current IP and request a new one.

Steps:
1. Release the current IP.
2. Obtain a new IP.
3. Restart the network.

Open a terminal and type the following commands:

1. Release the current IP
[praneeth@inferno]$ sudo dhclient -r

2. Obtain a new IP
[praneeth@inferno]$ sudo dhclient

3. Restart the network
[praneeth@inferno]$ sudo ifdown eth0
[praneeth@inferno]$ sudo ifup eth0

OR

[praneeth@inferno]$ sudo /etc/init.d/networking restart
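
If the machine has more than one network interface, it can also help to name the interface explicitly when releasing and renewing the lease, and then to check that an address was actually obtained. A quick sketch, assuming the interface is eth0 as above:

[praneeth@inferno]$ sudo dhclient -r eth0
[praneeth@inferno]$ sudo dhclient eth0
[praneeth@inferno]$ ifconfig eth0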



That's it, the network should be up and running :)