
The 15 Essential UNIX commands

Published on July 29, 2005
By Pete Freitag

Learning UNIX can seem like a daunting task: there are thousands of commands out there, each with hundreds of options. In reality, though, you only need to know a few of them.

I use UNIX quite a bit, usually either on one of our Linux servers or on my PowerBook with OS X, and these are the 15 commands I use most. If you can memorize these 15 commands, you can do quite a bit on a UNIX operating system, and add UNIX as a skill on your resume.

The 15 Most Important UNIX commands

  1. man - show the manual for a command. Example: man ls (press q to exit the man page).
  2. cd - change directory. Example: cd /etc/
  3. ls - list a directory, similar to dir on Windows. Example: ls /etc, or use ls -l /etc to see more detail.
  4. cp - copy a file or directory. Example: cp source dest. If you want to copy a directory, use the -R option for recursive: cp -R /source /dest
  5. mv - move (or rename) a file. Example: mv source dest
  6. rm - remove a file. Example: rm somefile. To remove a directory you need the -R option; you can also add the -f option, which tells it not to confirm each file: rm -Rf /dir
  7. cat - concatenate, or output, a file. Example: cat /var/log/messages
  8. more - outputs one page of a file at a time and pauses. Example: more /var/log/messages (press q to exit before reaching the bottom). You can also pipe other commands to more, for example ls -l /etc | more
  9. scp - secure copy, copies a file over SSH to another server. Example: scp /local/file [email protected]:/path/to/save/file
  10. tar - tape archiver. tar takes a bunch of files and munges them into one .tar file; the files are often compressed with the gzip algorithm and use the .tar.gz extension. To create a tar: tar -cf archive.tar /directory. To extract the archive into the current directory: tar -xf archive.tar. To use gzip, just add a z to the options. To create a tar.gz: tar -czf archive.tar.gz /dir, and to extract it: tar -xzf archive.tar.gz (see the short worked session after this list).
  11. grep - pattern matcher. grep takes a regular expression; to match a simple string you can use fast grep: fgrep failure /var/log/messages. I'm usually just looking for a simple pattern, so I tend to use fgrep more than regular grep.
  12. find - lists files and directories recursively. I usually pipe grep into the mix when I use find, e.g.: find / | fgrep log
  13. tail - prints the last few lines of a file, which is handy for checking log files: tail /var/log/messages. If you need to see more lines, use the -n option: tail -n 50 /var/log/messages. You can also use the -f option, which continuously shows you the end of the file as things are added to it (very handy for watching logs): tail -f /var/log/messages
  14. head - same as tail, but shows the first few lines of the file.
  15. vi - text editor. There are several text editors such as emacs and nano, but vi is usually installed on any server, so it's a good one to learn. To edit a file, type vi file. Press i to start inserting text and Esc to get back to command mode; to save changes and exit, type :wq from command mode, or to quit without saving type :q!. There are a million other commands, but that will let you edit files at a basic level.
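
To make the tar entry concrete, here is a short worked session; the directory name myproject is just a made-up example:

  # create a gzip-compressed archive of a directory
  $ tar -czf myproject.tar.gz myproject
  # list the contents of the archive without extracting it
  $ tar -tzf myproject.tar.gz
  # extract it into the current directory
  $ tar -xzf myproject.tar.gz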

Once you learn these commands and are comfortable with them, don't stop there; there are lots of other commands that can make your life easier.

Did I miss any commands that you think are essential to using a UNIX-based operating system?




Comments

I prefer less to more

i.e. $cat filename | less
OR you can do just $less filename

less allows you to scroll up and down; more only scrolls one way.

another good one to keep in mind is ps

i.e. $ps -A (this shows all running processes)

chown and chmod are good to know too
by Dutch Rapley on 07/29/2005 at 1:31:54 PM UTC
Oh yeah, one more - ln comes in handy when creating links to files; you can create symbolic links with $ln -s [source] [destination]

symbolic links are a must when creating virtual hosts on Linux, b/c the Flash forms will not work without them.

Explanation

If you're running a web server, you will probably have it configured to run virtual hosts. This may create an issue if you've decided to use CFMX 7's Flash forms. Flash forms reference files in the {virtual host webroot}/CFIDE/scripts directory. The CFIDE directory is located in the directory you specified upon installing ColdFusion MX 7. When you create a virtual host, the CFIDE directory doesn't exist for the virtual host. Here, there are two things you DO NOT want to do.

1. DO NOT copy the CFIDE directory to your virtual host directory. When ColdFusion is upgraded, the changes will not be reflected in the copied files that your virtual host references. Copying CFIDE will also allow access to the CF Administrator from your virtual host.
2. DO NOT explicitly create a symbolic link to the CFIDE directory. This will allow access to the CF Administrator from a virtual host, which should be avoided at all costs.

The best solution is to create a CFIDE directory in your virtual host, and then create a symbolic link to the CFIDE/scripts directory. Follow the steps below:

1. go to the directory of your virtual host
2. $mkdir ./CFIDE
3. $ln -s ./CFIDE ./cfide
4. $cd ./CFIDE
5. $ln -s {path to default host}/CFIDE/scripts ./scripts

Note: By default, Apache may not follow symbolic links. If your Apache installation doesn't follow symbolic links, you'll need to make the change in the Apache configuration file.
by Dutch Rapley on 07/29/2005 at 2:04:02 PM UTC
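
The Apache setting involved is FollowSymLinks. A minimal sketch, assuming an httpd.conf-style configuration and a made-up document root:

  # in httpd.conf or the virtual host config (the path is just an example)
  <Directory "/var/www/myvhost">
      Options +FollowSymLinks
  </Directory>
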
Don't forget to mention that commands can be chained together.

For example:
tail -f foo.log | fgrep "[ERROR]" will stream a log file, showing only lines that have "[ERROR]" in them.

grep (and fgrep) also has the -v option, which inverts the match. So:
fgrep -v "[ERROR]" will show you all lines that don't have "[ERROR]" in them.

Oh, and don't forget that on most linuxes, more is actually less, and vi is actually vim :D
by HKS on 07/31/2005 at 6:49:04 PM UTC
I haven't used Unix in years, but isn't chmod fairly critical?
by Patrick McElhaney on 08/01/2005 at 7:33:16 AM UTC
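
chmod and chown do come up constantly. A quick sketch of both, with made-up file and user names:

  # make a script readable and executable by everyone, writable only by the owner
  $ chmod 755 myscript.sh
  # hand a file over to user pete and group www
  $ chown pete:www somefile.txt
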
For RH boxes, "locate" may be a good addition. It can replace find for quick and simple searches. The only downside is that it requires a database for searches and may need updating multiple times a day. Still, it's pretty quick once it is good to go.
By the way, how are you liking the new server?
by Brad on 12/31/2005 at 4:22:17 AM UTC
Personally, I use 'locate' on a daily basis - both on my Linux machines and on my Windows box (via Cygwin).

Maybe my memory just isn't what it used to be, but having a quick way to find out where a file is often proves invaluable.

Following on Brad's comment (from 2 months ago), I update the locate database via cron job every morning @ 3am. That keeps the database current enough for my needs.
by Rob Wilkerson on 03/03/2006 at 11:31:29 AM UTC
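
For anyone setting that up, the locate database is rebuilt with updatedb. A minimal crontab entry for a 3am run might look like this (the path to updatedb varies by system):

  # run updatedb every day at 3:00 am (add with: crontab -e)
  0 3 * * * /usr/bin/updatedb
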
Hello guys, I am new to UNIX. I want to know: is there any command like 'who am i' other than whoami? If anybody knows, please inform me.

thanks and regards
raja.p
by Raja on 09/05/2006 at 1:23:19 AM UTC
Look at "w" and "who". Each gives a variation of the same information. Is that what you're looking for?
by Rob Wilkerson on 09/05/2006 at 1:14:39 PM UTC
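
A few related commands worth trying side by side (output varies by system):

  $ whoami      # prints just your username
  $ who         # lists everyone who is logged in
  $ who am i    # shows only your own login session
  $ w           # logged-in users plus what they are currently running
  $ id -un      # another way to print the current username
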
I'm looking for a way to 'copy 400 lines of a file into a new file beginning at line n'. Use case: a log where an exception was thrown - I want to move that data to a local file, to share it with colleagues or keep for quick reference. I thought a solution would include less -n ### > newFile.txt, then scp the new file to a local drive. But how do I limit the output to 400 lines?
by duane on 01/03/2007 at 3:34:40 PM UTC
You can use head -n 400 (or tail -n 400) to get the first (or last) 400 lines.
by tunaranch on 01/03/2007 at 6:53:34 PM UTC
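
To start at a specific line n rather than at the top or bottom of the file, tail and head can be combined; the line number and file names below are only examples:

  # copy 400 lines starting at line 1000 of the log into a new file
  $ tail -n +1000 /var/log/messages | head -n 400 > excerpt.txt
  # the same thing with sed: print lines 1000 through 1399
  $ sed -n '1000,1399p' /var/log/messages > excerpt.txt
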
Hi everybody! Is there any command in UNIX to undo the effect of a previous command? Say I do rm *.* on the root (!!) and then wish I could undo it?
by sangramsinh.takmoge on 10/13/2008 at 1:30:46 PM UTC
Nope. "Undo" isn't really part of the Unix lexicon. The OS is generally optimistic and assumes that you know what you're doing.
by Rob Wilkerson on 10/13/2008 at 1:38:44 PM UTC
Hello,
I want to run two commands on a single line. Is there any way to run two commands in a single line? I don't want to merge the two commands using a | pipe; I want to run both separately.
Thank you!
by amalan on 12/04/2008 at 6:37:38 PM UTC
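
The shell handles this with command separators rather than a pipe; a quick sketch:

  # run both commands one after the other, regardless of success
  $ cd /tmp ; ls -l
  # run the second command only if the first one succeeds
  $ mkdir backup && cp /etc/hosts backup/
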
What about the $history command? I think it shows only the last 500 commands, but I want to see every command I have typed up to now.
by sudheer on 09/08/2009 at 3:48:17 AM UTC
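
Assuming the bash shell, the length of the history is controlled by variables you can raise in ~/.bashrc (commands older than the previous limit are already gone, though):

  # keep more commands in memory and in the history file
  export HISTSIZE=10000
  export HISTFILESIZE=10000
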
In UNIX, ps shows processes. If you want to show only the processes for a particular terminal you can use ps -t, and to see all processes with a full listing you can use ps -aef.
by Deepak kr Sharma on 03/31/2010 at 10:06:33 AM UTC
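
A couple of concrete ps invocations (the terminal and process names are just examples):

  # full listing of every process on the system
  $ ps -ef
  # processes attached to a particular terminal
  $ ps -t pts/0
  # a common combination: find a particular process by name
  $ ps -ef | grep httpd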