Quick hit for today: I unfortunately find myself spending a bit too long surfing some pretty useless websites during the hours when I am supposed to be "enhancing shareholder value". I have the kind of personality that is best controlled when the bad options are simply taken away and the temptation no longer exists.
To that end, here is what I've done to keep my surfing time minimized (this is for my Mac, mind you. Doing this on Windows is certainly possible, but you'll need to use the Task Scheduler in place of the cron job).
- Switch to a full root shell with

sudo su -

This just means fewer keystrokes. You are in full-on root mode here though, so if you don't know what you're doing, skip this step and put "sudo" in front of all the subsequent commands.
- Copy the existing /etc/hosts file twice:
cp /etc/hosts /etc/hosts.work
cp /etc/hosts /etc/hosts.home
The 'hosts.work' file will be the 'restricted' file that keeps you from visiting bad sites.
- Open the /etc/hosts.work file in a text editor and add a line for each of the websites you want to restrict access to, like this:

127.0.0.1 www.facebook.com

- Create (or modify) the root cron file using
crontab -e
Cron is a task scheduler that is always running and will execute whatever you add to it. In this case, we're going to swap the hosts files for the length of the workday.
- At 9:00 a.m., Monday to Friday, copy the restrictive "work" hosts file in:
# Move to the restrictive hosts to keep users off waste sites for the workday
0 9 * * 1-5 cp /etc/hosts.work /etc/hosts

Note that a personal crontab edited with crontab -e takes no user-name column; each line is just the five time fields followed by the command. (The extra user field is only used in the system-wide /etc/crontab.)
- At 5:00 p.m., return the free "home" hosts file:
# Move the hosts file back to full access at the end of the workday
0 17 * * 1-5 cp /etc/hosts.home /etc/hosts

(Cron uses a 24-hour clock, so 5:00 p.m. is hour 17 -- "0 5" would fire at 5:00 a.m.)
That's it -- just like magic, any request made to one of your 'restricted' sites between 9 and 5, Monday to Friday, will come back with a 'page load error', making sure that you and I stay on track. (If a blocked site still loads right after the switch, your browser has probably cached the DNS lookup for it; restarting the browser clears that.)
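If you'd like to sanity-check the swap mechanics before handing them to cron, here's a minimal sketch that rehearses the same copies in a throwaway scratch directory instead of /etc, so nothing on your machine is touched. The directory layout and the facebook entry are just for illustration:

```shell
#!/bin/sh
# Rehearse the hosts-file swap in a scratch directory (not /etc).
dir=$(mktemp -d)

# Stand-ins for /etc/hosts, /etc/hosts.home, and /etc/hosts.work
printf '127.0.0.1 localhost\n' > "$dir/hosts"
cp "$dir/hosts" "$dir/hosts.home"
cp "$dir/hosts" "$dir/hosts.work"
echo '127.0.0.1 www.facebook.com' >> "$dir/hosts.work"

# What the 9 a.m. cron job does: swap the restricted file in
cp "$dir/hosts.work" "$dir/hosts"
during=$(grep -c 'www.facebook.com' "$dir/hosts")

# What the 5 p.m. cron job does: swap the open file back
cp "$dir/hosts.home" "$dir/hosts"
after=$(grep -c 'www.facebook.com' "$dir/hosts" || true)

# prints the counts: 1 during the workday swap, 0 after
echo "blocked entries during work: $during, after: $after"
rm -rf "$dir"
```

Running it should report one blocked entry while the 'work' file is in place and zero once the 'home' file is restored -- the same thing the two cron lines do to the real /etc/hosts.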