Connecting to Belkin N150 in Ubuntu 10.10


I bought a Belkin router a few months back. The label on the box said it supported only Windows and Mac operating systems, and I wondered why it couldn't support Linux (Ubuntu) when it could sense a Mac. So I set out on a frustrating journey to figure out how to make my Compaq Presario CQ50, dual booting Windows 7 and Ubuntu 10.10, detect my Belkin N150 router in Ubuntu.

First, I created a wireless connection in the Network Manager. It should be noted that the wifi button on my laptop never toggled between ON and OFF; it was always ON. The "Wireless Networks" option in the Network Manager applet read either "device not ready" or "wireless network disabled". Meanwhile, I could still connect to the wired Internet without any trouble.

So I probed for the driver used by my system.

$ ethtool -i eth0 

My driver was r8169, which I downloaded from this[1] site.

I installed it as instructed in that downloaded package.
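(For reference, installing such an out-of-tree driver package usually boils down to something like the sequence below. The file and directory names here are only placeholders; follow the README that ships inside the actual package.)

$ tar xjf r8169-driver.tar.bz2
$ cd r8169-driver
$ make
$ sudo make install
$ sudo modprobe r8169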

Next, I ran the following command to list all the network interfaces, wired and wireless, available on my system.

$ ifconfig -a 

Surprisingly, it showed an entry "wlan0" for the wireless network, apart from "eth0" and "lo" for the wired and loopback connections.

So, I thought of enabling the wireless connection using the following command:

$ sudo ifconfig wlan0 up 

But that threw an error, "SIOCSIFFLAGS: Operation not possible due to RF-kill". So I googled the error message and chanced upon this[2], this[3] and this[4] solution.

Based on the instructions on those sites, I did the following:

$ rfkill list 

The result was:

0: hp-wifi: Wireless LAN
    Soft blocked: yes
    Hard blocked: no
1: phy0: Wireless LAN
    Soft blocked: no
    Hard blocked: no

The solution, then, was to get "Soft blocked" set to "no" for hp-wifi, i.e. to unblock it.

So I did:

$ sudo rm /dev/rfkill 

Then I restarted the laptop and ran, again in the terminal:

$ sudo rfkill unblock 0 

or

$ sudo rfkill unblock wifi 

and then finally,

$ sudo ifconfig wlan0 up 

Bingo! It worked. Wireless networks were now detected by my Ubuntu 10.10.
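If you are curious, you can double-check the fix: re-running the commands below should now show "Soft blocked: no" for both entries and list wlan0 with its wireless settings.

$ rfkill list
$ iwconfig wlan0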

[1] http://goo.gl/4SUAn

[2] http://goo.gl/Ubpc6

[3] http://goo.gl/NmWWz

[4] http://goo.gl/oy3We

WGET to get them all from the web!

You might have come across the famous tech joke where an Internet newbie asks: "I want to download the Internet. Do I need a bigger hard disk?" LOL!
Well, there may not be any software or command that can download the entire Internet, but we do have a truly awesome command in Linux that can download an entire website!!! Surprised? Then read further.

WGET is a funtastic command available in all Linux distros, and it can be customised (parameterised) to download an entire website, or a part of it, as well as individual files from the Internet. Simply put, Wget is to Linux what IDM is to Windows (I have a feeling I am overstating the power of IDM). Let me show you how wget can do wonders!

Let's hit the bull's eye first.

$ wget    https://getch.wordpress.com

The above command downloads the homepage of getch.wordpress.com (index.html) and saves it in the current working directory.

Now suppose you want to download not only the homepage but the entire site. In other words, you may want to recursively download all the content linked from my blog's homepage. So we supply the recursive parameter (-r) to the wget command.

$ wget   -r  -p  https://getch.wordpress.com

This means you also get all the pages (and images and other data) linked from the front page. The -p parameter tells wget to include all supporting files, such as images, so the downloaded pages look the way they would online.

Some sites try to block wget requests on the grounds that they don't originate from a browser. So we disguise wget's accesses to make them appear to come from a browser like Firefox. This is how you do it:

$ wget  -r   -p   -U   Mozilla    https://getch.wordpress.com

-U Mozilla does the trick here.
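If a site is pickier, you can pass a full browser User-Agent string instead of the bare word Mozilla (the string below is just an example; quote it so the shell treats it as a single argument):

$ wget -r -p -U "Mozilla/5.0 (X11; Linux i686)" https://getch.wordpress.com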

So that you don't get blacklisted for hammering a site with wget, pause for some seconds between retrievals (30 in the example below) and cap the download speed accordingly.

$ wget --wait=30 --limit-rate=50k -r -p -U Mozilla https://getch.wordpress.com

Here wget waits 30 seconds between retrievals, and the download rate is capped at 50 KB/s.
What if you want to pause a download and pick it up later? Yes, there is a solution for that too: the -c (continue) parameter.

$ wget  -c   http://ubunturelease.hnsdc.com/maverick/ubuntu-10.10-desktop-i386.iso

Here we are trying to download Ubuntu Linux, which weighs in at around 700 MB. So if you ever have to interrupt the download, running the above command again will let you resume from where you stopped (paused).

If you want to get things done under the hood, use -b. This parameter performs the download in the background so you can take care of other tasks.
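For example, to resume the same Ubuntu ISO download in the background (by default wget writes its progress to a wget-log file in the current directory, which you can peek at any time):

$ wget -b -c http://ubunturelease.hnsdc.com/maverick/ubuntu-10.10-desktop-i386.iso
$ tail -f wget-log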

I am too lazy to enter each URL every time I need to download something. So I just put all those URLs in a text file once and feed it to wget as input, then sit back and have a cup of coffee. I don't have to type them out each time; the -i parameter does it for me.
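Something like this, where download-list.txt is just whatever file you keep your URLs in, one per line:

$ wget -i download-list.txt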

And to the final tip!
Mirror an entire website for offline reading. The format is:
$ wget --mirror -p --convert-links -P ./LOCAL-DIR WEBSITE-URL
and to quote an example:
$ wget --mirror -p --convert-links -P /home/manojkumar https://getch.wordpress.com

--mirror : turn on options suitable for mirroring.

-p : download all files that are necessary to properly display a given HTML page.

--convert-links : after the download, convert the links in the documents for local viewing.

-P ./LOCAL-DIR : save all the files and directories to the specified directory.

Windows has a costly alternative to wget that can do part of what wget does. The Teleport software for Windows lets you download an entire site for offline browsing. For more details and purchasing options visit: http://www.tenmax.com/teleport/

Kdenlive: The video editor of my choice

I just grew interested in uploading my videos to Facebook and Youtube. So I set out searching for a video editor, because I wanted to rip a particular portion of my video and convert it into an uploadable form. Believe me, I spent a whole day before finally settling down with a good editor.

I started out on the Windows platform (Win7). First, I chose the traditional Windows Movie Maker. It didn't work out: my .avi video had some encoding error, so Movie Maker did not recognise it. Then I tried Free Video Dub, which had a very simple interface, but a lot of patience was needed to edit videos and it crashed often on my Windows 7. I don't know whether it was a compatibility error; turning on the compatibility option (right click on the exe -> Properties -> Compatibility tab) did not work either. Googling around, I found Pinnacle Studio. It seems to be a famous editor out there, but it was large in size, took a while to load and crashed often.

So it was time to say goodbye to Windows and switch to Ubuntu :) But the pastures weren't much greener there initially. I chose the editor that was the choice of many: Pitivi. It is a neat app for Linux, but I found it difficult to edit the video strip on the timeline. Phew! I was almost about to quit when I heard about LiVES and Kdenlive from the Ubuntu community forums. I installed them straight from the command line without any hassle. The LiVES interface was not user friendly, so I will not dwell on that. Finally, I settled down with Kdenlive. It is a very user friendly app for Ubuntu. It looks similar to Windows Movie Maker but is easier to use, especially the timeline editing. I was able to finish my task in 30 minutes.

I rendered my project as an .avi video. The size of the resulting video turned out to be 814+ MB, a size that would take ages to upload to Facebook on my 512 kbps Internet. So I decided to convert it into a good-quality flv video. First, I tried WinFF, which is based on FFmpeg. The resulting flv video was only 14+ MB in size but had no audio. So I took recourse to Windows: Koyote Free FLV Converter did the job neatly, and at last I successfully uploaded <this> video. So next time you sit down on a video editing project, let Kdenlive be your first choice 🙂
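(For the record, had I stayed on the command line, a conversion along these lines with ffmpeg might have fixed the missing audio. input.avi and output.flv are placeholder names, and the -ar 44100 resample matters because the FLV container only accepts a few audio sample rates.)

$ ffmpeg -i input.avi -ar 44100 output.flv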

Additional:

You may also be interested in this video on LaTeX

Customising Gedit for Tcl scripting

I spent a whole evening trying to find an IDE or editor that would help me write Tcl scripts as part of my ns simulations. The major features I expected were autocompletion and keyword highlighting. Every editor/IDE I stumbled upon had one or the other, but not both. I tried Komodo Edit, Visual Tcl, Alpha(tk), etc. Some of these even tested my patience during installation.

Frustrated, I took recourse to Gedit, and by chance noticed that there are several plugins available for it which satisfy the requirements I mentioned above. One notable plugin is the "Autocompletion" plugin. As you type, it shows a popup of words already used in the same document. Simply put, it "autosuggests". This one was interesting.

Along with this, you can enable a few more plugins like "Bracket Completion", "Embedded Terminal" and "Session Saver", whose purposes are pretty much self-explanatory. And thus you almost have an IDE customised for coding Tcl, or almost any other language; needless to say, Gedit already provides decent syntax highlighting.
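Most of these plugins ship with Ubuntu's gedit-plugins package; if some are missing from the list, this should pull them in (the package name is what Ubuntu 10.10 uses and may differ on other distros):

$ sudo apt-get install gedit-plugins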

This is where you find the plugins:

Gedit -> Edit -> Preferences -> Plugins

For more on installing gedit plugins in Ubuntu, see my previous post.

<Here> is a complete list of Gedit plugins, and don't miss out on the LaTeX plugin 🙂


********************************************************************

~TiP~

Move Window Buttons Back to the Right in Ubuntu 10.04 / 10.10

********************************************************************