It’s that time of year again.
Do you dread going to the post office, or even asking the IRS to send forms to you? Going to an IRS office can be a hassle, but you can avoid the long lines and the demeaning IRS representatives. All you need is a free program called Wget, which is available for the most popular platforms.
More information here: http://www.instructables.com/id/Getting-IRS-forms-with-wget/
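As a quick sketch, the IRS posts its PDF forms under a predictable path, so you can build the download URL from the form name (the form name "f1040" here is just a placeholder example; check irs.gov for the form you actually need):

```shell
# Build the download URL for a given IRS form; "f1040" is a placeholder.
form="f1040"
url="https://www.irs.gov/pub/irs-pdf/${form}.pdf"
echo "$url"
# To actually download it:
# wget -nv "$url"
```

Swap in any other form number and the same pattern should work.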
We have no trouble backing up our personal computers, but what do we do about what we have on-line? I found a fairly easy solution for one site hosted on Blogger.com, and guess what: it is that good old wget command. You can download a site with:
$ wget -c -r http://www.whateverthesitenameis.com
The command will recursively fetch as much of the website as it can, and the -c option lets it resume until the job is done. When it finishes (if your computer supports long file names), you will have a directory named www.whateverthesitenameis.com. Cool. Then I thought about it and wanted to go one step further: put the files directly on another local server so I could access them at will, without interfering with the existing web pages there. For that we need an Apache web server that supports virtual hosts.
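A few extra wget flags, beyond the -c -r used above, can make the local copy more usable; this is just a suggested variation, not part of the original recipe:

```shell
# Optional extra flags when mirroring a site:
#   -np  stay below the starting directory (do not ascend to the parent)
#   -k   rewrite links so the copy browses correctly offline
#   -p   also grab images, CSS, and other page requisites
site="http://www.whateverthesitenameis.com"
cmd="wget -c -r -np -k -p $site"
echo "$cmd"
# Uncomment to run the mirror for real:
# $cmd
```

The -k option in particular helps if you plan to browse the copy straight from disk rather than through a web server.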
On that server where you have admin rights, you want to set up your web directory such as:
$ cd /var/www
$ sudo mkdir -p www.whateverthesitenameis.com
then let’s get the files.
$ sudo wget -c -r http://www.whateverthesitenameis.com
Now we need to let the server know about the new site. Come up with a site name that is not already in use, one you can give a viable DNS address. Your network will probably try the real Internet address first and never find your local site, so use an unused URL and put it in your local DNS with the local server's IP address. Let's substitute a new name such as www.mysitebackedup.com. We will need to make a file with that name on the Apache server in the sites-available directory of the Apache2 file structure, i.e.:
$ sudo nano /etc/apache2/sites-available/www.mysitebackedup.com
<VirtualHost 192.168.1.61:80>
ServerName www.mysitebackedup.com
ServerAlias www.mysitebackedup.com
ServerAdmin firstname.lastname@example.org
DocumentRoot /var/www/www.whateverthesitenameis.com
</VirtualHost>

Then you need to make a link to that file in the sites-enabled directory.
$ cd /etc/apache2/sites-enabled/
$ sudo ln -s ../sites-available/www.mysitebackedup.com
Then you need to restart the server so it picks up the details of the new site.
$ sudo service apache2 restart
You should be able to reach your local site now, provided the local DNS has been updated for the new site. I did have to disable the proxy setting in my Firefox to see the site.
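If you do not run a local DNS server, an /etc/hosts entry on each client machine does the same job. A small sketch using the example name and IP address from above (it writes to a temporary copy here; for real use you would append to /etc/hosts with sudo):

```shell
# Map the backup site's name to the local web server's address.
entry="192.168.1.61 www.mysitebackedup.com"
hosts=$(mktemp)
echo "$entry" >> "$hosts"
# For real use instead of the temp file:
#   echo "$entry" | sudo tee -a /etc/hosts
grep "mysitebackedup" "$hosts"
```

Once the entry is in place, the browser on that machine will resolve www.mysitebackedup.com to your local Apache server.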
Any time you need to update the site, just go back to the same directory and run the wget command again.
Newsreaders can be a real time saver. Sort of your own personal newspaper. Do you go to a menagerie of websites? Would it not be nice to have all the content you usually look at in one place? That is what a newsreader will do for you. One that I use (though there are better ones) is Liferea for newsfeeds.
Of course, when you first start there will be only a minimal number of items to read. You will want to add more sites to your reader. That can be done by getting the RSS link on the web page. If that does not work, try inserting the URL of the site in the feeder.
You can also save and load what are known as OPML files, which means you can share news feeds with others. We do a lot of command line work, so I like to be able to read feeds as text from the web. One program to do that is Newsbeuter. In fact, you can import the OPML file from your GUI-based news reader very easily:
$ newsbeuter -i feedlist.opml
Import of feedlist.opml finished.
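For reference, an OPML file is just XML listing your feeds, so it is easy to edit or share by hand. A minimal example might look like this (the feed name and URL are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<opml version="1.0">
  <head><title>My feeds</title></head>
  <body>
    <outline type="rss" text="Example feed" xmlUrl="http://www.example.com/feed.rss"/>
  </body>
</opml>
```

Each `<outline>` element is one feed, so merging two people's subscription lists is just a matter of combining the outline lines.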
Baking soda – the magic kitchen powder. Be sure to vote for the instructable.