Connecting to Belkin N150 in Ubuntu 10.10


I bought a Belkin router a few months back. The label on the box said it supported only Windows and Mac operating systems. I wondered why it couldn't support Linux (Ubuntu) when it could sense a Mac. So I set out on a frustrating journey to make my Compaq Presario CQ50, dual-booting Windows 7 and Ubuntu 10.10, detect my Belkin N150 router in Ubuntu.

First, I created a wireless connection in the Network Manager. It should be noted that the wifi button on my laptop never toggled between ON and OFF; it was always ON. The “Wireless Networks” option in the Network Manager applet read either “device not ready” or “wireless network disabled”. Meanwhile, I was able to connect to the wired Internet without any trouble.

So I probed for the driver used by my system.

$ ethtool -i eth0 

My driver was r8169, which I downloaded from this[1] site.

I installed it as instructed in that downloaded package.

Next, I ran the following command to list all the wired and wireless connections available for my system.

$ ifconfig -a 

Surprisingly, it showed an entry “wlan0” for the wireless network, apart from “eth0” and “lo” for the wired and loopback connections.

So, I thought of enabling the wireless connection using the following command:

$ sudo ifconfig wlan0 up 

But that threw the error “SIOCSIFFLAGS: Operation not possible due to RF-kill”, so I googled for solutions using that error message and chanced upon this[2], this[3] and this[4].

Based on the instructions on those sites, I did the following:

$ rfkill list 

The result was:

0: hp-wifi: Wireless LAN

Soft blocked: yes

Hard blocked: no

1: phy0: Wireless LAN

Soft blocked: no

Hard blocked: no
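As a quick aside, the soft-blocked entries can be picked out of this output mechanically. Here is a small sketch, run against a pasted copy of the output above rather than a live system (`rfkill_output` is just that captured text):

```shell
# captured rfkill output from above, stored in a variable for illustration
rfkill_output='0: hp-wifi: Wireless LAN
Soft blocked: yes
Hard blocked: no
1: phy0: Wireless LAN
Soft blocked: no
Hard blocked: no'

# remember each device header line; print it whenever a soft block follows
printf '%s\n' "$rfkill_output" |
  awk '/^[0-9]+:/ {dev=$0} /Soft blocked: yes/ {print dev}'
```

On this input it prints only the hp-wifi entry, which is exactly the device that needs unblocking.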

Now we would reach the solution if we could flip “Soft blocked” to “no” for hp-wifi, that is, remove the soft block.

So I did:

$ sudo rm /dev/rfkill 

then restarted the laptop, and then again in the terminal:

$ sudo rfkill unblock 0 


$ sudo rfkill unblock wifi 

and then finally,

$ sudo ifconfig wlan0 up 

Bingo! It worked. The wireless networks were now detected by my Ubuntu 10.10.
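For anyone hitting the same RF-kill error, the fix can be wrapped up in one small function. This is only a sketch of the sequence above (`recover_wifi` is a name I made up, and device index 0 matches my hp-wifi entry; yours may differ, so check `rfkill list` first):

```shell
recover_wifi() {
  # $1 is a command prefix: pass "echo" for a dry run, "sudo" to really execute
  prefix=$1
  $prefix rfkill unblock 0      # clear the soft block on the hp-wifi entry
  $prefix rfkill unblock wifi   # unblock every wifi device for good measure
  $prefix ifconfig wlan0 up     # bring the wireless interface up
}

recover_wifi echo               # dry run: prints the commands it would run
```

Run `recover_wifi sudo` only once the dry run shows the commands you expect.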





WGET to get them all from the web!

You might have come across the famous tech joke where an Internet newbie asks: “I want to download the Internet. Do I need a bigger hard disk?” LOL!
Well, there may not be any software or command that can download the entire Internet, but we do have a truly awesome command in Linux that can download an entire website! Surprised? Then read further.

Wget is a fantastic command available in all Linux distros, and it can be customised (parameterised) to download an entire website, or a part of it, as well as individual files from the Internet. Simply put, Wget is to Linux what IDM is to Windows (I have a feeling I am overstating the power of IDM). Let me show you how wget can do wonders!

Let's hit the bull's eye first.

$ wget http://example.com/

The above command downloads the homepage of the given site (the URL here is just a placeholder) and saves it in the current working directory.

Now suppose you want to download not only the homepage but the entire site. In other words, you may want to recursively download all the content linked on my blog's homepage. So we supply the recursive parameter (-r) to the wget command.

$ wget -r -p http://example.com/

This means that you also get all the pages (and images and other data) linked on the front page. The parameter -p tells wget to include all page requisites, images included, so the downloaded files look as they would online.

Some sites try to block wget requests on the grounds that they don't originate from a browser. So we disguise wget's accesses to make them appear to come from a browser like Firefox. This is how you do it:

$ wget -r -p -U Mozilla http://example.com/

-U Mozilla does the trick here.

So that you do not get blacklisted for running wget over a site, pause between retrievals and cap the download speed accordingly.

$ wget --wait=30 --limit-rate=50K -r -p -U Mozilla http://example.com/

Here wget waits for 30 seconds between retrievals and the download rate is capped at 50 KB/s.
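If you use this combination of flags often, you can wrap them in a tiny helper. `polite_wget` is a hypothetical name, and the function only prints the command here so nothing is actually fetched:

```shell
polite_wget() {
  # print (rather than run) the full polite-mirroring command for a given URL;
  # drop the leading "echo" once you are happy with what it composes
  echo wget --wait=30 --limit-rate=50K -r -p -U Mozilla "$1"
}

polite_wget http://example.com
```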
What if you want to pause a download and resume it later? Yes, there is a solution for that too: use the parameter -c.

$ wget -c http://example.com/ubuntu.iso

Here we are trying to download an Ubuntu Linux image of about 700 MB (the URL is a placeholder). If you ever have to interrupt the download, running the above command again lets you resume from where you stopped.

If you want to get things done under the hood, use -b. This parameter performs the download in the background so you can take care of other tasks.

I am too lazy to enter each URL every time I need to download one. So I just enter all those URLs once in a text file and provide it as input to the wget command with the -i parameter, so that I can sit back and have a cup of coffee.
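Sketched out, the batch setup is just a plain text file with one URL per line (urls.txt and the URLs are placeholders); the wget line is commented out so the sketch runs without network access:

```shell
# build a download list, one URL per line
cat > urls.txt <<'EOF'
http://example.com/first.iso
http://example.com/second.iso
EOF

# wget -i urls.txt -b   # -i reads the list, -b backgrounds the download
wc -l < urls.txt        # sanity check: two URLs queued
```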

And now for the final tip!
Mirror an entire website for offline reading. The format is:
$ wget --mirror -p --convert-links -P ./LOCAL-DIR WEBSITE-URL
and to quote an example (the URL is a placeholder):
$ wget --mirror -p --convert-links -P /home/manojkumar http://example.com/

--mirror : turn on options suitable for mirroring.

-p : download all files that are necessary to properly display a given HTML page.

--convert-links : after the download, convert the links in the documents for local viewing.

-P ./LOCAL-DIR : save all the files and directories to the specified directory.

Windows has a costly alternative to wget that can do part of what wget does. Teleport, a program for Windows, lets you download an entire site for offline browsing. For more details and purchasing options, visit their site.

Kdenlive: The video editor of my choice

I recently grew interested in uploading my videos to Facebook and YouTube. So I went searching for a video editor with which I could rip a particular portion of my video and convert it into an uploadable form. Believe me, I spent a whole day before finally settling on a good editor.

I started out on the Windows platform (Win7). First, I chose the traditional Windows Movie Maker. It didn't work out: my .avi video had some encoding error, so Movie Maker did not recognise it. Then I tried Free Video Dub, which had a very simple interface, but a lot of patience was needed to edit videos, and it crashed often on my Windows 7. I don't know whether it was a compatibility error; turning on the compatibility option (right-click on the exe –> Properties –> Compatibility tab) did not work either. Googling around, I found Pinnacle Studio. It seems to be a famous editor, but it was large in size, took some time to load and crashed often. So it was time to say goodbye to Windows and switch to Ubuntu :)

But the pastures weren't much greener there initially. I first chose the editor that is the choice of many: Pitivi. It is a neat app for Linux, but I found it difficult to edit the video strip on the timeline. Phew! I was almost about to quit when I heard about LiVES and Kdenlive on the Ubuntu community forums. I installed them straight from the command line without any hassle. The LiVES interface was not user friendly, so I will say no more about it. Finally, I settled on Kdenlive. It is a very user-friendly app for Ubuntu. It looks similar to Windows Movie Maker, but is easier to use, especially for timeline editing. I was able to finish my task in 30 minutes.

I rendered my project as an .avi video. The resulting video turned out to be 814+ MB, a size that would take ages to upload to Facebook on my 512 kbps Internet. So I decided to convert it into a good-quality flv video. First, I tried WinFF, based on FFmpeg. The resulting flv video was 14+ MB in size but had no audio. So I took recourse to Windows again; Koyote Free FLV Converter did the job neatly, and at last I successfully uploaded <this> video. So next time you sit down to a video editing project, let Kdenlive be your first choice 🙂


You may also be interested in this video on LaTeX.

Embedded Terminal in Gedit

Gedit is one of the finest editors available for Linux.

I have been using it for quite some time and have always wished that it contained an embedded terminal similar to the one in Kate. Little did I realise that it is available in default Gedit installations, until I stumbled upon a blog post on the topic. All you need to do is enable the Terminal plugin in Gedit Preferences. Still, there was a catch: Ubuntu did not list the “Embedded Terminal” option in its plugin list. Searching Google for a while gave me the following solution.

Install the missing plugin options …

$ sudo apt-get install gedit-plugins

That's all!

Now you can find the “Embedded Terminal” option in

Gedit –> Edit –> Preferences –> Plugins.

Check the box, then enable display of the bottom pane (View –> Bottom Pane, or just Ctrl+F9).

Bingo! There is the terminal. Now you can write code and compile it without leaving the editor!

More interesting tips <here>


Installing Network Simulator(ns2.34) in Ubuntu10.10

Download the following installation steps as a pdf manual <here>. This manual also contains the steps to install ns2.34 in Fedora 12.

Read this article in Scribd <here>


The nightmare days are over!

Now you don't have to spend hours on your Linux distro (Ubuntu) installing and validating ns2.x.

Ubuntu 10.10 has greatly simplified things. You can now install ns, nam and xgraph with just a single command in the Terminal:

 $ sudo apt-get install ns2  nam xgraph 

You will be prompted for your user password. Enter it and watch Ubuntu 10.10 do everything for you!

Happy Ubuntuing 😉


Heeding my blog visitors' requests, I would like to update this post with the steps to install ns2.34 the traditional way.

Step1: Download ns-allinone-2.34 package from this <site>. I will be using ns version 2.34.

Step2: Place the ns-allinone-2.34.tar.gz file in your home folder(/home/micman in my case). Right click on the package and extract the contents in the same home folder.

Step3: Next, open the Terminal(Applications–>Accessories–>Terminal in ubuntu)

Step4: Change to ns-allinone-2.34 directory

 $ cd /home/micman/ns-allinone-2.34

Step5: First, install the dependencies:

  $ sudo apt-get install build-essential autoconf automake libxmu-dev gcc-4.3 

Note that we are installing an older gcc version alongside the default, as ns2.34 works well with gcc 4.3.

Edit the makefile found in ns-allinone-2.34/otcl-1.13/ as follows:

Find the line that says:
CC= @CC@
and change it to:
CC= gcc-4.3
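If you prefer not to open an editor, the same one-line change can be made with sed. This is a sketch applied to a scratch copy (the file name is illustrative; in the real tree the target lives under ns-allinone-2.34/otcl-1.13/):

```shell
# make a scratch file containing the line we need to change
printf 'CC= @CC@\n' > Makefile.in.sample

# swap in the older compiler, exactly as in the manual edit above
sed -i 's/^CC= @CC@$/CC= gcc-4.3/' Makefile.in.sample

cat Makefile.in.sample   # now reads: CC= gcc-4.3
```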

Step6: Begin ns2.34 installation

 $ sudo su


  # ./install 

Step7: Once the installation completes successfully, i.e. without any errors, we need to add the path information to the file ~/.bashrc:

  $ gedit   ~/.bashrc 

Step8: Append the following lines to the file ~/.bashrc:

# LD_LIBRARY_PATH
OTCL_LIB=/home/micman/ns-allinone-2.34/otcl-1.13
NS2_LIB=/home/micman/ns-allinone-2.34/lib
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$OTCL_LIB:$NS2_LIB

# TCL_LIBRARY
TCL_LIB=/home/micman/ns-allinone-2.34/tcl8.4.18/library
export TCL_LIBRARY=$TCL_LIB

# PATH
XGRAPH=/home/micman/ns-allinone-2.34/bin:/home/micman/ns-allinone-2.34/tcl8.4.18/unix:/home/micman/ns-allinone-2.34/tk8.4.18/unix
export PATH=$PATH:$XGRAPH

# Note: the XGRAPH line above may wrap in your browser, but it must be entered as a single line

Here, replace /home/micman with the path to your own home folder.

Step9: For the changes to take effect immediately, do the following:

$ source ~/.bashrc 
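After sourcing, you can sanity-check that the new directories made it onto $PATH. The sketch below simulates the check with the example path from Step8 (replace /home/micman as before):

```shell
# simulate the PATH after sourcing ~/.bashrc
PATH="$PATH:/home/micman/ns-allinone-2.34/bin"

# does any ns-allinone-2.34 directory appear on PATH?
case ":$PATH:" in
  *ns-allinone-2.34*) echo "ns2 paths present" ;;
  *)                  echo "ns2 paths missing" ;;
esac
```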

That's all!

Type ns and you should see the “%” prompt; type nam and the nam startup window should appear. This shows your installation has been successful.


  • If the tk compilation failed, especially for tk3d.c, make sure you have installed libx11-dev package.
  • If the otcl configuration failed, make sure you have installed x-dev and xorg-dev packages.


bash: ns: command not found


This could be because you have not set the $PATH variable, so the OS does not know where to look for the command “ns”. A more detailed solution is <here>.


Segmentation Fault


found here


Undefined reference to vtable


found here

This article is sponsored by Evansys Technologies.

