Newbie Get Started and Migrate to Linux Tip by Tip

[I have re-located this article to this page, from its original location among my webpages. This article was most recently updated c. July, 2013  –L.L. ]

A Linux Chartreuse Paper    by  Lamebrain Lucas

Personal, Home-User/Social-User and Moby Linux for the Rest of Us

Motto of this Blog:
“I have tried to make answers to End-User, Consumer Linux questions that are conceptually clear, and digestible for those of us who do not have Computer Science degrees.”  —Lamebrain Lucas
Co-lateral Motto(es):
*  “The only way the People may keep themselves from being enslaved by technology, is to have their * own * Technology” *.
*  “A Mad Scientist Linux site”  *
NOTE that this writing is squarely aimed at Linux * DESKTOP * usage.  By this, I mean “a Linux-based Desktop Operating System”.  You can acronym this “LBDOS” if you want:  my usage in what follows, will simply be “desktop Linux”.  And this for casual, social, and SOHO (Small Office/Home Office) users of an operating environment on laptops, towers, netbooks, and tablets.
* Desktop * computing, if you didn’t already know, usually refers to a “graphical environment”, like good ol’ Windows XP.  Point-and-click.  At least that’s pretty much it, boiled-down.  Further, this writing is aimed at amateurs (like myself), and new users.  Those in search of information as to Linux   * servers *, or embedded-systems, should probly keep searching—though I have tried to incorporate some amount of general Linux knowledge here, which many people (it is to be hoped) may find useful.
Please NOTE:  at the time this project was begun,  LBDOS (Linux-based Desktop Operating System) popularity was indeed increasing.  To-day, at the time I am actually getting around to posting this information online (Finally!), it is truly debatable whether or not the Linux desktop will become really popular—at least within the space of the next several years.  Yes, there * are * still a lot of us Linux-users out there (and * contributors * to Linux, too!).  But at the time I actually am able to upload this article, a lot has changed, from the time I originally undertook it, initially, as a “side-project”.

Why the change?  Because of the “Linux Desktop Environment Mess” of circa 2010-present (April of 2013).  Term it the “LDEM”, if you like.  So what does this mean, in plain English?
What it boils down to is that every flippin’ OS that there is—from ANDROID to Microsoft WINDOWS to ~ whatever—has two (2) main parts:  there’s the underlying core system, which is (for our purposes) what people call the “kernel”:  this is just the lower-level Master Control Program (MCP) that makes up the “core” of the operating system.  Working with just a kernel is basically like working with the old MS-DOS system:  NON-graphical, no point-and-click stuff, no pictures, bupkiss; just a black-and-white environ where one must type letters and symbols into the machine, in order to make it do something.  Fine for computer nerds, but rather cumbersome for the rest of us.  So DEs (Desktop Environments) were developed.  For our purposes here, this is often the same as saying “GUI”—a term you might have heard, which is another one of those three-letter acronyms computer engineers love.  GUI means “Graphical User Interface”.
A real DE (Desktop Environment—Desktop Environ for short) is actually more than just a plain old GUI—a full-blown DE comes with extra functionalities built-into the software, so that it will often “read your mind”, and “automagically” do things for you (like, say, turning-off the glowing light in the thumb-drive you just unmounted with a Right-click, so you just “feel better” about pulling it out of the USB socket).
And of course, the Desktop Environment is the GUI software that runs “on top of” the “kernel”, and which lets us do things the “modern” way—by point-and-click, as opposed to the “old-fashioned”, “MS-DOS” way of computing.  Persons who really know their way around a computer often ** prefer ** the “old way”:  it gives them more “power” on the system, as they can fine-tune the processes they are trying to run.

I don’t know these people.  My friends are largely persons who just want to get a report edited and ready to incorporate into a presentation by Monday morning.  Or they just want to update their FaceBook page, Skype with their friends, and have some fun on the computer.
A really good DE (Desktop Environ) makes these things easy.  Windows XP had the LUNA program—though this is often confused with the classic wallpaper of XP (you know—the pastoral scene of the rolling green hills, with fluffy white clouds in the sky).  This was a good Desktop program, and it ran/runs on top of the underlying system that is WINDOWS XP.  WINDOWS Vista and 7 replaced this with the “Aero” Desktop.  And in WINDOWS 7, the Aero DE is equally good in performance (from the standpoint of a person who doesn’t know or much care how the computer works, and who just wants to get something done).  Microsoft’s Aero is a lot * bigger * in sheer size than XP’s LUNA, and it can be slower to execute a task, especially on equipment that lacks sufficient processing power.
But LUNA and Aero are both pretty * reliable *, and neither seems to crash or freeze-up very much (** IF **, that is, you are using a version of ms WINDOWS that’s actually “good”—such as XP SP-3 or Windows 7; if you’re stuck with Vista, XP SP-1, or Windows Me, the workability and stability of LUNA or Aero ain’t gonna help you that much).
Just plain Linux (or, if you prefer, “GNU/Linux”) is just a kernel, on top of which runs some DE (Desktop Environ program), if you’re wanting to use the system for things like, say, Small Office/Home Office, or Instant Messaging your girlfriend, or updating your personal webpage.  The Linux kernel is a * very * * good * kernel.  As a lower-level, NON-graphical core system, the Linux kernel is a fine piece of software, with many remarkable capabilities.
A knowledgeable person * can * accomplish all these tasks without a GUI/Desktop Environ program, just using the Linux kernel and a command-line as the operating system.  But for the rest of us, there have historically been DEs for Linux, like GNOME and KDE (the “K Desktop Environment”).  KDE was about the first real Desktop Environ for use with Linux, and was developed in the 1990s.  For a while, KDE was pretty much the only DE Linux users knew—or at least that we knew of.  Then the GNOME project was begun, in part to support the GIMP image-editor.  Other DEs followed, and also some of the “window-manager” programs that were out there were “enlarged”, and so acquired many of the features and capabilities of full-blown Desktop Environments, like GNOME and KDE.
But it was essentially GNOME that emerged as hegemon, in the mid-2000s.  Or at least the closest thing Linux ever had to a DE hegemon.
Now (as of roughly 2010/Ubuntu 10.04, 10.10, 11.04, &tc.), what we see is that what was arguably the best DE for Linux (for us N00bs and “Lamers”—in other words, us NON-nerds)—namely the GNOME series-2 Desktop Environ (“GNOME 2.x”)—has come to an end, the developer who was (arguably) GNOME’s driving-force having quit the project in disagreement with the other engineers.  And the remaining people have taken GNOME in a rather new direction (GNOME 3).   GNOME 3 is rather “buggy”, for lack of a better term.  GNOME 3 seems harder to use, and lacks some of the features that “just worked” in GNOME 2.x.
Nor can one simply download GNOME 2.x and run it on a modern Linux release:  it won’t be compatible with the newer system libraries.  You have to either use some release of desktop Linux that shipped with it (like Ubuntu 10.04, which uses the 2.6.x kernel), or else maybe learn to manipulate Slackware or Gentoo Linux.  And I * still * doubt if you could get the old GNOME 2.x to run on a current, 3.x-kernel-era system, even then.
The good folks who gave us Ubuntu in the first place—Canonical Ltd.—have created their own Desktop Environ.  It is called the “Unity DE”, but sadly, at the time of this writing, Unity seems beset by many of the same issues that affect GNOME 3.  In truth, Ubuntu’s new Unity is really just yet * another * GUI layer, one that actually runs * on * top * of GNOME 3, but which provides yet another user-experience.

To try to clarify a bit:  Ubuntu’s Unity DE, the Windows 8 “Metro” Start screen, and the GNOME 3 * Shell * are all examples of something that’s * new * to probably most personal and work-station computer users:  these three are each * another * GUI “layer” that runs * on-top * of the “real” GUI (like GNOME 3 itself, or, in the case of Microsoft products, the traditional Windows desktop, which in Windows 7 was/is called “Aero”).

Great.  So why not just turn-off the new “Metro” interface in Windows 8, and just use it like Windows 7?  Apparently, Microsoft doesn’t want people doing this, and makes it rather difficult to do [at least this seemed to be the case at the time of this writing:  reviews of the free Windows 8.1 update, which is supposed to fix user-difficulties with the GUI, are still mixed, as far as I can tell].  But as I’ve tried to indicate, things may have changed since this was originally posted.

In Ubuntu, we * can * just select GNOME 3 sans the Unity or GNOME Shell (s) at login-time, by clicking the gear symbol.  Unfortunately, many users report disappointing results with GNOME 3 itself.  [UPDATE:  This state of affairs seems to have been greatly improved since June of 2013.  GNOME 3 now seems like a more functional desktop.]

Ubuntu’s Unity seems to do a lot better, however, where a user is running Linux on a tablet device—where the screen is about the size of a pop-tart, give-or-take a few centimeters.  This seems true of the newest-generation of Desktop interfaces available for Linux, generally:  GNOME 3, KDE 4, Cinnamon DE, and Unity seem to possess features that help the user experience on small devices, that have small screens, and on which touch-screen functionality is of greater importance.  Unfortunately, this new-found functionality is also getting-in-the-way of “Traditional”, work-oriented use of a computer.  Not a few users of the Linux desktop have complained mightily, as to the difficulties of productivity and work-flow in the default configurations of the DE s I just named.

Not to be outdone, Microsoft has created the Windows 8 “Metro” interface (a.k.a. “Modern UI”)—which is (more-or-less) the appellation that has stuck to the GUI in Windows 8.  I’m still on Windows 7—though I seldom boot my Windows partition, as Linux Mint 13 XFCE Edition currently satisfies my needs.  But friends of mine who have recently replaced their computers have vociferously complained of a “cold shock” of “very unfriendly user experience” with Windows 8.  They continually carp about the ** exact ** same type of usability headaches as users of GNOME 3 and Unity DE in Linux.  At the time I am re-editing this (late May of 2013), the buzz is to the effect that Microsoft is going to release a patch in June or July of this year, that will give WINDOWS 8 a Start button in the lower-left of the screen, reminiscent of Win 7 and other previous versions of WINDOWS.  No confident word yet, as to what the * price * may be for this “patch”, or whether it will cost WINDOWS users anything at all.  It’s enough to make you want to break your piggy-bank and buy a MAC.  [UPDATE:  The free patch from Microsoft, called “Windows 8.1”, is now out, and DOES NOT SEEM TO HAVE RESOLVED THE USER-FRIENDLINESS ISSUES WITH WINDOWS 8.  A great many people seem angry and frustrated, from what I can gather.]

Apple, Inc., seems alone in having kept its sanity.  Seemingly from the outset of the “netbook and ‘device’ revolution”, Apple decided they would offer basically two (* 2 *) paradigms:  there would be 1) Mac OS X (for Traditional desktop users), and 2) “iOS” (& rel.) for “devices” (iPods, PDAs, smartphones, and tablets).  Which to my mind makes a great deal of sense.
The Linux Community-at-large, however, has been working furiously to try to solve this “GUI-headache”.  Interfaces like Cinnamon DE and MATE-for-Linux are in heated development.
All this has caused a number of people to abandon Desktop Linux in recent years (or so it seems—just ** exactly ** how many people are using a free-to-download operating system like the Linux-based desktop is hard to gauge).  But us core people are still here.  And as I said, programmers are working overtime, trying to make interfaces like Windows 8, GNOME 3, Unity, et al. actually * user-friendly *.  By “user-friendly”, I mean in terms of us “productivity” and SOHO users.  (SOHO = “Small Office / Home Office”).
So, in the article below, I suggest you pay particular attention to paragraphs having to do with “Desktop Environs”, “GUI”, and such.  There will be a shake-out, and (it is to be hoped) ONE “general-purpose” DE for Linux will emerge to take the place of good ol’ GNOME 2.x.

#####################################

I have also striven to write this with a much greater tone of humility than some of the Linux blogs I’ve noticed, and not to assume that you already know as much about computing as many other postings on the web seem to think you should.  I still need to improve it in this regard.  And in some other ways, too.  There are only so many hours in the day.

I have attempted to arrange this blog in such a way that you are sort-of coerced into reading this document first, because it ties the others together.
I have tried to make this information clear, and without using unnecessary technical jargon, and no longer than needed.  Trying is not the same as succeeding, as we know.  Serious and civil comments as to how I might improve the style of this writing I will  take under advisement.
Please excuse spelling/grammatical errors/typos.  This is still a work-in-progress.
This document makes good reading—on a perhaps not-otherwise-committed weekend-day, when you have the house to yourself, and if you are someone actively considering trying graphical Linux on your laptop, tower, netbook, tablet or perhaps even your phone.
I do not have much time to maintain this blog nowadays, so I often don’t (sorry), but I am hopeful that the information is still helpful.  I am perfectly aware that THIS IS NOT A ** PERFECT ** DOCUMENT.  I wrote it to the best of my ability at the time.  I post it in the HOPE that it may yet HELP somebody who is having trouble with Linux, or who is merely curious about it, perhaps thinking of migrating to it.
This sucker is in need of some ** editing ** / ** updating **, and I’ll get to that as soon as time permits.
NOOB and NEWBIE ARE NOT terms of insult, as some people might infer:  A NEWBIE just means “someone who is NEW to the experience”—whatever that experience might be.

If you are at all interested in what I present here, you might like to check-out the link:  http://www.icpug.org.uk/national/linnwin/step00-linnwin.htm

Though this blog is not confined to the use of Ubuntu, there is a pretty dern good guide to the use of Ubuntu at this link:

http://ubuntuguide.org/wiki/Ubuntu:Oneiric#Import_PDF_files_into_a_word_processor

This manual is mostly one of NON-GRAPHICAL/COMMAND-LINE INFORMATION, however.  I have detailed other manuals in the text below.
HERE IS THE LINK IN ANOTHER FORM:  http://ubuntuguide.org/

NOTE that this above manual is written for Ubuntu 11.10 (“Oneiric Ocelot”)—but there is also a link for a similar manual for 10.04 LTS (“Lucid Lynx”:  http://ubuntuguide.org/wiki/Ubuntu:Lucid ).  And for Ubuntu 12.04 LTS:  http://ubuntuguide.org/wiki/Ubuntu_Precise .  Links for various Ubuntu releases are found in the box at left of the page-title.  This manual, like the one at the preceding link, is mostly one of non-graphical/command-line information, though.  But it can sometimes still be helpful.  If nothing else, you get a suggestion of what the names of the popular programs are, and what they do.  “Hip” Linux users sometimes replace some of these apps with more ‘modern’ counterparts, which people learn of via networking.  But at least having the names of “traditional” apps will get you a foot-hold, and you can Google from there.  It is also very useful to search for apps in Software Center and Synaptic Package Manager:  you’d think * nothing * would beat Google searches, but apparently some really good Linux apps just get * used *, and—perhaps because nobody has trouble with them—knowledge of their existence just doesn’t make it to Google’s otherwise formidable database.
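For instance, if you ever want to check which Ubuntu release and which kernel you are actually running, or hunt for an app by a keyword, a terminal will tell you in a few seconds.  A small, hedged sketch (standard commands on Ubuntu-family systems; the exact output varies by release):

lsb_release -a                    # prints the name and version number of your Ubuntu release
uname -r                          # prints the version of the Linux kernel you are running
apt-cache search photo editor     # keyword search of the repositories, much like Synaptic's search box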
Most current Linux distros come with about all the apps you’re probly gonna need—more-or-less—except sometimes for some multimedia codecs and encoders.  [Remember that you probably had to install Java and Adobe stuff to any new WINDOWS computer.  Modern desktop Linux is no different.  Ubuntu, for example, has the “Restricted Extras” package in its default repositories, and this will likely cover you for all the “proprietary” codecs you’ll need.  This package is assembled for us by Canonical Ltd., the makers of Ubuntu, and is built especially to be compliant with your install of Ubuntu.  It is important to go to Software Center in any new install of Ubuntu, and type “Restricted Extras” into the search-field (upper right of the program shell).  You’ll find a display of versions of this package for Kubuntu, Xubuntu, Lubuntu, and, of course, regular Ubuntu.  So Canonical has made this * EASY * for us:  it installs with a couple mouse clicks, and the system configures it for you, in about 15 minutes or so—depending, usually, on how much processing power your computer has.  You only need to install it this once, and you probably won’t need other codecs, except maybe the ones for DVD playing.  Check the documentation for the specific release of Ubuntu you’re dealing with:  the Restricted Extras package might very well be all you ever need for third-party codecs in Ubuntu. ]
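If you happen to prefer a terminal over Software Center, the same package can (as far as I know) be pulled in with a couple of commands like these.  This assumes the stock Ubuntu edition; swap in kubuntu-, xubuntu-, or lubuntu-restricted-extras for the other flavors:

sudo apt-get update                              # refresh the package lists first
sudo apt-get install ubuntu-restricted-extras    # codecs, fonts, Flash, and other "restricted" bits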
Note that the manual at this above link is itself indexed with links, so you can click on the specific question/area with which you need help.  I have detailed some “less command-line oriented” manuals, in the text below.
HERE IS A LINK TO UBUNTU’S ONLINE MANUALS, AND THESE ARE [A LITTLE MORE] ORIENTED TOWARD DOING THINGS GRAPHICALLY—BY POINT-AND-CLICK:
https://help.ubuntu.com/  [Ubuntu 8.04 through 11.10; soon to include 12.04 LTS ]
https://help.ubuntu.com/10.04/index.html [this is the official documentation for Ubuntu 10.04 LTS “Lucid Lynx”]
THERE IS ANOTHER GOOD BLOG FOR LINUX DESKTOP BEGINNERS AT:

http://www.reallylinux.com/docs/windowstolinux.shtml

HERE’S A REAL GOOD LINK FOR DESKTOP LINUX KNOWLEDGE [though it may be a bit dated]:  http://linux.about.com/od/linux101/u/userpathII.htm
Here is a link to “The Ultimate Linux Newbie Guide” and a LINUX WEB-RING, with a lot of instructions—though the home-page seems to be lagging behind the current state of desktop Linux development a bit:  it is featuring articles about Ubuntu 9.10 and 10.04, and a site-use survey for “fall 2011”—and I discovered this page in * January * of 2013.  Even so, it looks like a useful web-portal for those who are still using an older release of Ubuntu or another distro—either because they just prefer to do so, or because they haven’t found a newer release that their hardware will accept.  Links to chat-rooms, too—but I haven’t yet had time to see if they still work.
Here’s the link:

http://www.webring.org/hub/linux?w=853;rh=http%3A%2F%2Fwww.linuxnewbieguide.org%2F;rd=1

A CONDENSED ENUMERATION OF THE STEP-BY-STEP ENTRIES begins much further below in this document
—————————————————————————
Please be aware that, whether you are viewing * this * document from an Apple/MAC, or Microsoft WINDOWS, or from Linux, if you will just hold-down one of the ctrl keys on your keyboard, and then, with the other hand, press and release the letter “F” key (and then release ctrl), you will get a search-window into which you can type any word or phrase, having to do with something about which you are curious, with respect to the subject-matter of this writing.  Then you can just hit “Enter” to have it take you to the first occurrence of the word or phrase.  Every time you hit Enter, it takes you to the next instance of that word or phrase, below.  If it reaches the end of the document, because there aren’t any more occurrences of the search-object, it’ll tell you.  You can close this dialog-box to read something, and then use ctrl + F to open it again, and pick up searching where you left-off.  It does not hurt to do this any number of times.
This can be a lot easier and quicker than just scrolling-around in here.  Hit Enter, after typing in your word or phrase that you want to read about.  Close this Find-search dialog box by clicking the red X in its upper-right corner, if you are in some normal document.  When viewing something formatted to .pdf, the little search-window can probly just stay where it is.  You will see the word or phrase highlighted, somewhere on your screen.  If the document program did not find a match, it will let you know.  You can then start with a new word or phrase.  Every time you hit “Enter”, the “Finder” will try to find the same word or phrase further-on in the text.  This works in most document-viewing programs, like Microsoft Word, WordPad, Open Office Writer, Corel Office, KOffice, Adobe PDF-Reader, and others.  In pdf documents, it works basically the same, but just a small window opens to the left.
Get to know this method of searching documents.  (It works on webpages displayed in your browser, too.)  This is a good personal computing skill to have, and everyone should have it, because it’s easy, once you get started.

I began this file as a very basic list, but it grew into a larger opus, my humble offering to anyone else who, like myself, would like to move their * desktop * usage from ms WINDOWS to a Linux desktop, for productivity and daily use.
Please note that many of the documents presented in my “database” begin with a lower-case “L” (l), at the front of the title.  This “l” is for “linux”, and serves me in sorting files on my own computer, and this “cue” is also in use for other purposes besides this blog.

If Linux Is So Great, Why Haven’t I Heard Of It Before Now, And Why Aren’t More People Using It?
An alternate title might be:  “Why Linux Might Not Be for You”.
Or even:  “Defend Yourself from Spies, Lies, and Prying Eyes”
Well, more people * have * been * using it; prior to the GNOME Desktop Environment Project blowing itself to pieces circa 2011-2012, Linux desktop usage was climbing.  See my article:  “The Linux Desktop Environment Mess, XFCE, and our ‘Home Away from GNOME’ (for 10” Screens and Up)”
The tone of most Linux blogs, I have noticed, is to proselytize Linux.  The authors want to sell it to you, though it is a free operating system, and so is not for sale.  I take a somewhat different tack.  I AM GOING TO TELL YOU THINGS NO GOOD LINUX-HEAD IS SUPPOSED TO STATE UP-FRONT, and I will do so in a frank and matter-of-fact manner, and, perhaps most unusually, in the foremost of my paragraphs—rather than nearer the end of an article.  At least once you get into the actual TIPS (below), with the numbered entries:  that is the part of this document in which I try to be really frank and candid.
Proselytizing seems to be the style in which most Linux-for-desktop blogs are written.  Mine will be different.
I have tried NOT to cast too dark a cloud over things.  I hope I have been successful in this regard, because I like desktop Linux and its concepts.
If you think all of this is a bit much, and perhaps not worth it, I will say right away that I have written the following entries from the “worst case scenario” point-of-view.  While Linux can be somewhat fiddly to get working on * some computers * (especially laptops and tablets; Linux usually does much better on a tower or other non-portable), desktop Linux can often be installed without a hitch.
I HASTEN TO ADD THAT THESE ARE PAGES CONTAINING WAY MORE INFORMATION THAN YOU’LL PROBABLY NEED IN ORDER TO START USING DESKTOP LINUX.
If you are concerned about the sort of compatibility issues that worry my aunt Molly—i.e. “What if I can’t use Internet Explorer?  Will the browsers in Linux—like FireFox, SeaMonkey, Opera, and Iceweasel—will they open webpages, or will most webpages say ‘You have to have Internet Explorer’?”  Well, you * can * use Internet Explorer, as there are several ways.  But nearly all browsers in Linux to-day are about 99.5% to 100% compatible with any of the webpages on the internet.  And they’re as easy as Internet Explorer.  So you don’t have to use Internet Explorer, unless you want to.  It would be much better, really, to just back up your Internet Explorer favorites and settings, copy these to a USB thumb-key, and then import them into Linux’s Firefox or Chromium, or another browser that comes in a Linux-coded version.  These are legion.
Note that if your fears are not assuaged, as to whether or not browsers other than Internet Explorer will display your favorite webpages, there is a program called User Agent Switcher (UAS), which will cause FireFox or another browser to appear to webpages as though it were Internet Explorer, and it is easily added.  But really, almost nobody needs to bother with it anymore, because to-day there are probably practically no websites left that continue to discriminate against other browsers.
Myself, I use a Linux desktop as my daily os (Linux Mint 13 XFCE Edition)—and this almost * exclusively *—I haven’t booted-into my Windows install in * months *.  And I’m not a “computer nerd”.  More like a “nerd-in-training”.  I have yet to hit a webpage that says “you need Internet Explorer”.  And I do a * lot * of surfing.
Nor do you need to be concerned about “How will I get to my e-mail?”  Or “How will I connect to Facebook?”  Or “What about Skype?”  Or, “Can I watch You Tube in Linux?”  Or “Can I access Twitter, and use it normally?”  My response to such worries is “Quit living in the past!!”  FireFox and its “cousins” (just named) are supported by like 99% of all websites to-day, and have been for quite some time.  (So is OPERA, which is not a Mozilla-relative, and which is very much like Internet Explorer, and is free-of-charge.)
Nor do you need a special client to do your e-mail anymore:  these days, you just go to your e-mail’s generic log-in page with your browser (FireFox, SeaMonkey, OPERA, or whatever) and log-in.  It’s that simple in Linux, and it has been that simple in WINDOWS for quite a while now, even if you didn’t know it.
Most desktop Linux distros come with a full suite of applications, so you probly WILL NOT have to install much in the way of actual * applications packages *.  If you want to add something, well, the major distros have gone to great lengths (in recent years) to make this easier and easier.  Most things can be downloaded and installed with a few mouse clicks, and your confirmation with your password.
You don’t have to drag your Windows apps with you, over to Linux.  Just about every app you can think of from Windows has at least one Linux counterpart, and most work just as well.  Especially for an ordinary home-user/family/social-user.  Professionals—like journalists, photographers, small business owners and accountants—are now using desktop Linux.  Modern desktop Linux has “grown up”.  And it gets better all the time.  I will say, though, that certain professionally-oriented apps (like sound-mixing, videography/moving-picture editing software, high-end CAD/CAM, certain office-apps) are still not up-to-par in desktop Linux, at the time I write this.  It depends on just what your profession is, and exactly what you’re trying to accomplish with software.  But there are now many professionals out there using desktop Linux.  So I guess it just depends.  But as I’ve said, this “migration guide” is aimed at us newbie, casual/social and SOHO (Small Office/Home Office) users, and moby (laptop, netbook and tablet) users.
Apps * written * for * Windows * will generally NOT run in Linux.  Most apps you will know from WINDOWS have a Linux “alter-ego”—a Linux twin that works just as well.  Some actual Windows-apps will run in Linux with a “compatibility-layer”, such as the Wine program (free).  Others will not.  Apps written for Linux will not run in Windows, though some have a “Windows version” that does, and I think there are also compatibility-layer programs available (free) for running Linux apps in Windows, which tend to work better than running Windows apps in Linux does.  I hope that is clear.
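Just as a rough sketch of how the Wine route can look in practice (the installer name setup.exe below is only a made-up example, not a real file you will have):

sudo apt-get install wine          # installs the Wine compatibility-layer from the default repositories
wine ~/Downloads/setup.exe         # runs a Windows installer or program through Wine (hypothetical file)
winecfg                            # optional: opens Wine's graphical settings dialog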
The bottom-line with this—at least to * my * way * of * thinking *—is that we should just look for a native Linux app first.  Often it is right under our nose:  just look in Ubuntu’s “Software Center” at all the stuff that came already installed, as part of the release.  Just open Software Center (“Software Manager” in Linux Mint—same thing), and click on where it says “installed”:  this will display just the current roster of apps that are already present in the system.  To download and install an app that is desired, just go back to where it says “Get Software”, or “Install”.  Any time you try to install something that is already in your Ubuntu system, it’ll let you know, and ask you if you want to cancel or proceed, just like ms WINDOWS.
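The terminal equivalent of poking around in Software Center’s “installed” list would be something roughly like this (VLC here is just an example package name):

dpkg -l | grep -i vlc          # is VLC already on the system?
apt-cache policy vlc           # shows the installed version (if any) and what the repositories offer
sudo apt-get install vlc       # installs it, if it is not there yet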
Other Linux distros have ways you can check on what is already there.  If it’s not already there, try getting it from your default software repository.  In most of the distros of which I treat here, this is easy to find.  Reading the user-manual, or a little Googling can help you.  Should this not be enough (relatively rare these days, for those of us who are ordinary users), there may be non-default sections of your repositories, which you can enable, so that you can access them.  This can usually be accomplished with a few mouse-clicks, but you should research it first, because your distro’s maker often does not take full responsibility for stuff from non-default parts of a repository, PPA-stuff, or experimental-ware (“future-verse”), or perhaps even some jazz somebody anonymously posted on the web for free download.  Be smart, and have some kind of a handle on what you’re installing—before you install it.  This can save time and trouble later-on.  And this is not much different from using WINDOWS.
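For the Ubuntu family, enabling one of those extra “PPA” sources usually boils down to a couple of commands like the ones below.  The PPA name here is purely a made-up placeholder, and, as I said, you should research a PPA before trusting it:

sudo apt-get install software-properties-common    # provides add-apt-repository on recent Ubuntu releases
sudo add-apt-repository ppa:some-team/some-app     # hypothetical PPA name, for illustration only
sudo apt-get update                                # re-read the package lists so the new source is seen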
It’s actually worse in WINDOWS, because even though you * think * you removed all the traces of the malware that came with that goofy music-app you installed this morning, that may not at all be the case, because 1) Windows has been so malwared, over the decades of its popular existence; and 2) because the inherent legacy-structure of its file-system makes it easier to malware than UNIX/Linux/MAC.  Using System Restore, dialing-back to an earlier “last known good configuration”/”Restore Point” DOES NOT, unfortunately, remove all the injected code and key-logger spyware that some inadvertently installed malware may have put in your WINDOWS system.  I can state this from * experience *, as can many, many others.  This is a legacy-fact of these operating systems, and I find it is not likely to change.  I base this conclusion on my own use-experience, such as it is, and in reading and studying on my own.  I should hasten to add, I suppose, that since this document was first written, it has become  increasingly apparent that Microsoft has (* finally *) gotten ’round to closing * a * lot * of the security holes in WINDOWS (Windows 7).  System updates/security-patches for Windows (7) have also become more timely.  (These are only any good to you, of course, ** IF ** you actually allow them to download and install when they are offered.)
But I still have it on pretty good authority that WINDOWS * itself * keystroke-logs everything we do, and reports this data back to Microsoft.  Which doesn’t bother most users, or so it seems.  Google does the same—indeed, that’s essentially Google’s business-model:  they collect data that can be used to market to us in a narrow-focused way, and that’s what makes the user-data valuable.  Which is how the Google Corporation, basically, pays its bills.  For those of us who still object to this “tracking”, however, there are ways to use desktop Linux to avoid user-tracking, key-logging, spyware, and surveillance, generally.  Even for those not particularly concerned in this area, modern desktop Linux has other virtues that make it a nice * desktop * system.  If you’d like some better support for this position, a significant link I discovered just the other day would be:

http://linuxmafia.com/~rick/faq/index.php?page=virus

Internet Explorer will (probably) run in Linux with a compatibility-layer—there are several available, and these are free-of-charge to download and use, just like Internet Explorer itself.  Mozilla FireFox (essentially the free descendant of Netscape Navigator—but much improved and updated) will run wonderfully well in Linux, because it has a Linux version.  So does OPERA, which I have recently installed, and I’m considering it for use as my main browser.  [At the time I write this, I am using FireFox 18 on Linux Mint 13 XFCE Ed.  I find it works wonderfully well for everything, except certain MMORPGs (massively multi-player online role-playing games).
Linux, like its third-cousin twice removed Apple OS X, is game-deficient in several areas.  But the rest is pretty solid—and it gets improved all the time.]  FireFox has had a Linux version for quite some time (actually, probably since day-one).  So do most of the major web-browsers you’ve heard of.  The OPERA web-browser is free, has both a Linux version AND a Windows version, and is a lot like Internet Explorer.  Just about all the same browser plug-ins and add-ons available to WINDOWS have a Linux-coded version, and the browser knows which is which, so you can’t install the wrong add-on by mistake.  Many of the web-browsers that we can use in desktop Linux also have natively-coded WINDOWS versions:  Firefox, OPERA, SeaMonkey, Arora, the Lynx browser (this one’s non-graphical), and others.  This is also true of a number of popular free/FOSS GUI programs:  GIMP, UNISON, VLC, &tc.  So these programs each have a version built just for Linux, and versions built just for Microsoft WINDOWS.
If you want to use a separate e-mail client, though, there are plenty of nice ones in Linux, and several which are capable of everything their Windows counterparts can do.  Thunderbird would be one to check-out.
I guess I need to add, before moving-on, that Ubuntu loads an older, “stable” version of most programs from its default repositories.  So when I installed SeaMonkey, that’s what I got.  No other webpages balked—only Microsoft’s Live Hotmail (though now I see where I should’ve clicked the first time:  so it’s really not an issue).  And I’ve done a * lot * of surfing with SeaMonkey.
Social-media apps are well integrated, and have been for some time.  Skype, Twitter, and Facebook will all work just fine.  The Google services are  the best—Google Talk, Google IM, etc.—they seem to work for everybody—whether you’re sending a message from a Linux computer to a Windows one, or vice-versa.  And they’re * FREE *.
And to-day’s Linux can read and write to the modern NTFS filesystem Windows uses (it has been able to do this for quite some time), plus all the other stuff—with the possible exception of .docx.  This can be remedied by just putting stuff in the classic Microsoft .doc format, which is arguably a good practice anyway, because every program can read it.  [UPDATE:  since the time I first wrote this, compatibility of office programs between Microsoft and Linux has improved:  by 2011 or so, only people with * old * versions of MS Office (& the like) would seem to be in much difficulty; MS Office 2007 and up seems to be able to read .odt just fine, and Linux’s flagship Open Office can read .docx.  So just about everybody can breathe easy now, where document formats are concerned.]  Linux apps like Open Office or Libre Office can do this easily.  There are folks out there with older machines that only have ms Office 2000 installed, and who are afraid to try anything else.  There are even a couple of professors I know.  So they cannot open .docx—even though they are using Microsoft and Windows XP!  (I don’t especially * like * .doc or .docx:  but as I said, more people know how to open .doc, so we are kind of stuck with it—at least for a while longer.)
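If you ever do need to shuffle a document from one format to another, Open Office / Libre Office can even do it from the command line.  A minimal sketch (the file names are hypothetical, and on some systems the command is soffice rather than libreoffice):

libreoffice --headless --convert-to doc report.docx    # writes report.doc into the current directory
libreoffice --headless --convert-to pdf report.odt     # same idea, .odt to .pdf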
Modern desktop Linux can do about 98% of everything that WINDOWS can do.  This includes live-chat, P2P File-sharing, photo-editing, MP3-player management, spreadsheets, ripping cds, burning files to DVDs (even in formats compatible with WINDOWS !).  And about anything else you can think-of.  Linux to-day even has some capabilities which Windows does not, like its ability to recover files from a failing harddrive—sometimes more effectively than Windows/DOS programs.  And to be run as a “live file-system”, * with * persistent-saving, so that minor changes you make to settings and softwares will “persist” from one re-boot to another.  Yes, one * can * run ms Windows as a live, compressed-image (“live-filesystem”)—for example, this can be done using virtualization software, and a “live-image” of Windows downloaded from the internet—in other words, “running Windows from inside Windows”:  but this would violate Microsoft’s EULA, not to mention various other legal strictures.  And I don’t know how one would be able to set up persistence.
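As one concrete (and hedged) illustration of that file-recovery ability:  the GNU ddrescue tool, available in most distros’ repositories, can copy whatever is still readable off a dying drive into an image file.  The device name /dev/sdb1 below is only an example; triple-check which drive is which before running anything of the sort:

sudo apt-get install gddrescue                    # the Ubuntu/Debian package that provides the ddrescue command
sudo ddrescue /dev/sdb1 rescued.img rescue.log    # copies the failing partition to an image file, keeping a log so it can resume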
It is also probly true that WINDOWS 7 can do about 98% (or so) of everything that desktop Linux can do.  But this leaves two or three percent that can be * very important *.  Especially from a system privacy-and-security point-of-view.
There are equivalent programs for just about every app that you could run in Microsoft Windows, and 98% of ‘em are free.  Most Linux apps work just as well as those in WINDOWS, though I can think of a couple that don’t quite make it to par.  Desktop/graphical-environment Linux is also much better documented (online instructions) than it used to be.  To-day, Linux is * just * as * well documented * as WINDOWS, where it comes to * most * questions or issues.  I feel compelled to add that I recently had to solve a problem with WINDOWS (Win 7), and it was very * hard * to find the information.
Linux is immune from most malware and spyware, just the way it comes.  This can be considerably improved, however, after the system has been set-up and is running.  And to a standard that far-exceeds * any * install of ms WINDOWS.
Modern desktop Linux now runs on laptops, netbooks, tablets, mobile-devices, and phones.  You no longer need a special version of desktop Linux for use on a netbook or other device with a “smaller-than-normal” sized screen; since roughly late 2009, the major desktop distros have incorporated the necessary resolutions-settings and other programming, that usually allow the distro to “automagically” recognize your screen size and adjust to it during boot-up.  Puppy Linux does not necessarily do it this way, but rather uses its historic method of a “screen resolution wizard”, which helps guide you through it, graphically, and with a modicum of fuss.
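And if a screen ever does come up at a funny resolution, the xrandr utility (part of the standard X.org toolkit) will usually let you inspect and change it by hand.  The output name VGA1 below is a guess; the real name varies from machine to machine:

xrandr                                  # lists your display outputs and the resolutions each one supports
xrandr --output VGA1 --mode 1024x768    # sets a specific resolution on a specific output (names will vary)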
When you Google something, your Windows computer is connecting to a Linux machine, at the other end, because Google uses its own special build of Linux, and Linux became highly successful in the server market in the 90s and 2000s.  To-day, Linux controls about 51% or more of the market for server operating systems.  So when you go on the internet, and go to a webpage nowadays, chances are better-than-even that you’re connecting to a Linux-box.  Even if you don’t know it.
And desktop Linux is ramping-up for Cloud Computing.  (Check-out PeppermintOS.)
I read somewhere recently that “If you had trouble setting-up Windows, Linux is not for you.”  Well, I say those people are just * quitters ! *
Desktop Linux can be “fiddly” to get running because computers are built and tested for Microsoft products, and not for other operating software, such as Linux, BSD, or Menuet, to name but three.  (Menuet, of course, is not part of the “*nix” family, but is “its own thing”.)  Microsoft products (essentially the WINDOWS operating system) probably came already installed on any of the computers you’ve ever had.  And this most often means it was put there at the factory, or by some knowledgeable person.  And so it may have been tested by industrial means, or stress-tested to check for performance problems before the machine was sold.  Some “tweaks” may have been performed, too, to make WINDOWS run right on that particular box.  Yes, there are also people who pirate WINDOWS.  I tend to think most of those are probably CS majors who just lack the money to buy a decent computer.  Maybe I’m wrong.  In any case, WINDOWS 7 is harder to pirate, and I do not want such thinking to be the thrust of this blog.  Especially when there is a whole Linux universe to explore, and it’s open-source.
Linux is something you must install yourself, if you really want to get the full benefits available to you in desktop Linux.  Or else try to find a reputable computer shop to do it for you.  Good luck to you, if you’re intent on finding a shop willing to install it to dual-boot (where your ms WINDOWS install is preserved, and you can use WINDOWS too).  Some machines don’t do this arrangement well, and the computer shop is probably aware of that.  And they have a lot of sick WINDOWS computers in the back, infected with viruses or out-of-whack due to operator-error, waiting to be fixed.  [I should interject here, that many people “install” Linux as a “virtual computer system”, using programs like VMWare, or VirtualBox.  That way one can run Linux from within WINDOWS, just as if it was any other app.  This method also is one that is easy to un-do, even for persons who don’t know much about computers.  This is not the path I took, though—for various technical and personal reasons.  So I don’t have much to say about it at this juncture.
I do hear that this way of running Linux suffers * somewhat * from difficulty interfacing with other drives you might want to use on the system—like a USB thumb to store data— or perhaps an internal partition for data-storage—and that there are “more frequent networking problems”.  Most of such issues may have been fixed in recent years.  But I tend to still favor running Linux from a USB thumb-key (or maybe even better an external USB harddrive—if your machine’s BIOS can be configured to allow it, and most of ’em nowadays can be).  There are even ways to boot Linux from a machine’s BIOS that does not natively allow USB-booting.  More about this later.  Be sure to read the comments threads (below):  others will post. ]
So most people install Linux themselves.  [NOTE that I am also going to be posting other information at this site, as to how to get the full benefits of Linux and its customizability, speed, and security, * without * touching your harddrive.]  I myself have a dual-boot on this very laptop, the machine having come from the office supply super store with WINDOWS 7, and I then having made an Ubuntu install to the harddrive.  If I had followed the instructions I found online, and compiled a custom kernel [ http://homeport.org/~bcordes/satellite-l500-install.html ], Ubuntu might have run a lot smoother on this Toshiba laptop.  [It is an L515 Satellite:  those instructions are for an L500; but that seems to be about as close as I could find, and is in fact a lot closer (to the exact make and model) than the instructions I can find for the washers, dryers, and other appliances I have to repair at my DAY job.]  But I notice that with the few tweaks and settings-changes I did make, Ubuntu 10.04 runs acceptably well on this laptop:  I * have * a usable system.
Toshibas do not have the best reputation for running Linux, anyway.  Nor is this company’s reputation the worst, in this regard.  If I had known I was going to develop an interest in Linux, I tend to think I’d’ve bought an HP laptop or maybe a Compaq (essentially the same animal, since HP has owned Compaq for some time now).  Thinkpads also seem to have at least an above-average reputation, where it comes to running Linux.  But as I say further-on in this document, components vary so much—even within the same production-run of the same model(s)—that there is no guarantee as to which make/model of computer will reliably boot and run Linux, and there’s no absolute way to tell in advance.  I will add only that Ubuntu seems to be running pretty decent on this laptop now, with the adjustments I have made over the past many weeks, in my spare time.  SO REALLY, GOING BY BRAND-NAMES FOR THIS PURPOSE IS ** NOT ** A TERRIBLY RELIABLE INDICATOR.
If I had it to do over again, I’d probably still do the hdd install, if for no other reason than I like to do some (very moderate) experimenting.  But there’s an easier alternative.  Desktop Linux to-day can be installed and run from a USB thumb-key.  There are several ways to do this (see Appendix A of this document), which I have tried to detail in this blog.  Another worthy option is an external, add-on harddrive, or a micro-drive (mini-external-harddrive).  If you have the money to buy an external harddrive, many times this can be a good way to get around the headache some computers have, when it comes to dual-booting Linux from the same drive which contains MS WINDOWS.
But it’s not an absolute cure—it doesn’t always solve the problem—if you have the problem in the first place, which a lot of machines don’t.  My USB-powered harddrive only cost about 100 USD, at the time of this writing.  But that can be a lot if you haven’t got it.  It is probly also true that an install to an external harddrive is easier to undo or re-format than the traditional dual-boot setup.  Provided you are able enough with graphical disk-tools that you don’t accidentally modify the wrong drive or partition—see the tip having to do with installing Linux, further-on in this document.  Note that these types of add-on drives were not invented with the intention that they would be on and running all the time, but were rather intended more for things like storage and backup.  However, technology advancing as it does, many people on long flights now seem to have them up-and-running for many hours without issue.
Remember that an external harddrive has no built-in cooling, so an external way of helping it dissipate heat may be advisable.  (Like maybe sitting it on a laptop cooler or netbook cooler).  As there is no protective chassis and computer case, anything that protects against unexpected vibration might be a good idea too.  Perhaps just operating it while it lays on one of those gel-filled vinyl masks people sometimes wear— across the region of the eyes to soothe and cool the blood vessels—might be an idea.  Such a thing will absorb and dissipate a fair amount of heat, and is good at dampening vibrations, such as those that can sometimes occur in takeoff and landing.  Drugstores often have them to sell.
It is also possible (depending on the hardware) to remove the harddrive from a junk computer and physically “mount” it inside an enclosed or semi-enclosed “chassis”, which you could make yourself.  Some of what the construction trades often call Luan (Loo-awn), a.k.a. “doorskin”, or even odds-and-ends that may be available could be good to use for this.  Be sure it has adequate “vent-holes”, so it can dissipate heat.  A link pertaining (somewhat) to this is:

http://www.justlinux.com/forum/archive/index.php/t-148499.html

And let’s not forget about VirtualBox and VMWare.  These are just two of the better-known programs which allow you to run desktop Linux as a “virtual machine”—an imaginary ‘other computer’ * inside * the ‘imagination’ of your own WINDOWS operating system—and this just by downloading and installing either VirtualBox or VMWare to your WINDOWS system.  This seems preferred by many people, and is easy to un-do.  However, some people report problems with this method when they try to get Linux to copy files to another disk (like a USB thumb-key) or to “surf the web” with its browser.  I don’t know much about this “VM” method of using desktop Linux—my own [limited] experience runs toward the other methods I have talked about.  As with * any * alterations to your ms WINDOWS system, you should independently research this method ahead of time, if you intend to try it.  This is an old rule.
Start out with live-cds (or live USB thumb-keys:  most USB-installers for desktop Linux produce a “live-disk” type install—but it really doesn’t matter if it’s a “live” or a “full, traditional hdd-type” install:  desktop Linux won’t make any changes to your harddrive unless you authorize it).  These do not make any changes to your harddrive, as they run wholly from the computer’s RAM.  This is what I did.  Boot one and play around with it.  You could just get a 10-pack of cd-Rs at your local discount-store—they’re basically cheap nowadays.  Frankly, so are USB thumb-keys (“jump-drives”).  But it is IMPORTANT to spend the extra few bucks, and get the Quality ones, because we’re going to be playin’ around with an * operating system * here.  And if * that’s * poorly copied because of cheap discs or other media, then all bets are off.
All of the major Linux distros and nearly all of the minors are nowadays offered as a freely downloadable file, and when that is burned to a cd or DVD, this file will become a bootable live Linux disc.  There are several file types in this category, but I prefer the .iso file.  It is complete, and its integrity is verifiable with any one of several “checksum utilities” that are downloadable to Windows, and are free.  I just find the .iso file to be a more newbie-friendly way of getting Linux than, say, a torrent or a .rar—to name but two.
I like HashCalc 2.02 for WINDOWS to verify the MD5 sum of any Linux distro that I downloaded as an .iso file.  It is very simple, and by default it only displays the .iso files you have downloaded.  This makes it easier for us noobs to pick out the right file to check.  And it runs in XP, Vista, and Windows 7.  It can also be downloaded to Windows from (relatively) secure sites like C-NET/Download.com.  This is nice, because we don’t want our Windows to get infected with some malware while we are in the process of getting our first Linux to try.
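For what it’s worth, once you already have a running Linux, the same check is a one-liner.  The file names here are only examples of what an Ubuntu download and its published checksum file tend to look like:

md5sum ubuntu-10.04-desktop-i386.iso    # prints the MD5 sum; compare it, by eye, to the one on the download page
md5sum -c MD5SUMS                       # or, if you also downloaded the MD5SUMS file, let it do the comparing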
You should do some research, though.  Because just like any download to Windows, there is a hazard of malware and spyware.  Get some user reviews of the program you intend to use.  Make sure your anti-virus software is up-to-date, before you download it.  Always create a restore-point first.  Try to download only from reputable sites.  If you right-click  any downloaded file before opening it, the context-menu that pops-up will usually give you the option to scan the file with your anti-virus software.  Malwarebytes lets us do this too.
Some sites I find to be good to download this kind of stuff from are as follows.  But just because they seem to work without much issue for me, doesn’t necessarily mean it will be the same for you and your computer.  As with any (Linux) advice on the web—or maybe any kind of advice at all in your whole life—you should research it at least some first, and you “use it at your own risk”.  Note that some of these sites are essentially just for WINDOWS stuff.  I’ll mark the Linux-y ones with a “ * “.
C-Net (used to just be known as “download.com”–and that’s still part of the official name, if you look closely.)
FreshMeat  *
Tucows
SourceForge  *
Softpedia    [not strictly for Linux stuff, but has a * lot * of Linux to offer]
LinuxQuestions.org  *
The Debian Foundation  *
Kernel.org  *
Code Ranch  *
Ubuntu Forums  *
I can’t think of any more right now.  I’ll add to it later.
Many of the modern distros, by the way, have live support available online.  Most have an IRC channel.  You can obtain help for almost any particular program or distro on the IRC server irc.freenode.net (examples: #debian, #ubuntu, #python, #firefox, etc.).  You can also find user communities on irc.freenode.net.  A link I currently use for Ubuntu’s live-chat help-channel is:

http://irc.netsplit.de/channels/details.php?room=%23ubuntu&net=freenode

Just choose a username for the duration of your chat-session, and click “continue”.  If there are a lot of ppl on there (which is usual), then you’ll have to be as patient as you would be if you were going onto, say, Computer Hope live, in order to get some free advice about Windows.
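Those channels can also be reached without a web page at all, using a terminal IRC client such as irssi.  A quick, hedged sketch (the nickname is obviously a placeholder):

sudo apt-get install irssi              # a simple terminal IRC client, found in the default repositories
irssi -c irc.freenode.net -n mynick     # connect to freenode under a nickname of your choosing
# then, inside irssi, type:  /join #ubuntu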
Modern desktop Linux is easier now than ever.  [Well, except for the “Desktop Environment Mess” of roughly 2011 – currently:  to which we will come, further-on in this paper.]  When you’re done with a play-session, just use the mouse to shut it down.  (Or see the entry as to shutting down Linux, below in this document.)  Learn at your own pace.  When you’re ready, install it to a USB thumb-key.  Boot and run it from there for awhile.  Devote some spare time to it as you wish.  Don’t go overboard, and turn it into work, trying to learn it in some marathon, “all-nighter” session.  Linux is perhaps best learned as Play, rather than Work.  Keep at it, a little at a time.  This way you’ll have migrated to Linux, without having done the “Work” of migrating to Linux.
If you become dissatisfied with a thumb-key install (as this way of running modern Linux is still * somewhat * experimental), then let me suggest that you try installing to an external harddrive.  This is not that difficult (at least where it comes to Ubuntu and its variants).  And it can solve a number of bugginess-issues where using desktop Linux on a laptop is concerned.  It can be worth a try.  [I intend to post more extensive documentation respective of this, as time allows.]
Or there is the traditional way, which is a dual-boot, side-by-side install to your internal harddrive, which is also handled by Ubuntu’s own installer (the Ubiquity program, which is graphical/point-and-click).
I suppose I should also interject at this point that * some * (meaning a relative few) machines have difficulty with the dual-boot arrangement.  So if you decide to install Linux the traditional way (to its own partition on the harddrive), it may turn out that this condition contributes to Linux’s “fiddliness” on your machine.  Which is another reason I tend to gravitate towards a thumb-drive install (see Appendix A)—or an external, add-on harddrive, or a micro-drive (mini-external-harddrive):  and you might also check out virtual-machine softwares, like VMWare, which are also free to download and use.  The only way to hope to know beforehand is to research it.  Online, and by joining a LUG (Linux User’s Group), and by whatever other means one can find.
A geeky friend who regularly picks-up her voicemail is a plus.  Learn all that is available from your machine, in terms of its specs and what hardware is installed.  Then research this, as to Linux-compatibility.  It will take less time than you think, and you may even find it enjoyable.  Then, try out a desktop Linux distro or two as a “live environment” (boot from a cd/DVD you’ve burned, or, perhaps better, a bootable USB thumb-key you can make pretty easily with one of the free GUI programs for doing this on WINDOWS:  Universal USB Installer from Pendrivelinux.com is a good one).
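If you already have some kind of Linux handy, a bootable thumb-key can also be written the old-fashioned way, with dd.  BE CAREFUL with this one:  the .iso name is just an example, the /dev/sdX name is a placeholder, and whatever device you point dd at gets wiped:

lsblk                                                          # identify which device node really is your USB stick (on recent releases)
sudo dd if=ubuntu-12.04-desktop-i386.iso of=/dev/sdX bs=4M     # write the image to the (placeholder) device
sync                                                           # make sure every last block is flushed before you unplug the key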
Migrating to Linux * does * require some investment of personal time and effort, even at this late date (2011/2012).  [CAVEAT: The BIOS of some machines (mostly older ones) Does Not have the ability to boot from DVD—some older machines will boot from cd only. And sometimes this is because of the cd-tray itself—or BOTH the cd-tray AND the BIOS don’t support booting from DVD, but can boot from a cd. Also, the BIOS of some machines (especially older ones) does not have the option to boot from their USB hardware port/software interfacing.  This can be overcome by learning to make a special type of “boot-helper cd”.  You could look at the file “l boot from usb when bios is not able”, or Google the issue if it crops-up for you.  Sometimes this lack of USB-connection bootability can be remedied by just updating the BIOS’ firmware (the miniature operating system that drives BIOS).  This is sometimes referred-to as “re-flashing the BIOS”.  Better let a professional tackle this one, unless you’re already WELL on your way to being a hardware tech professional.  If a BIOS re-flash procedure doesn’t come off right, the machine will probly not be able to boot anything again, until-and-unless it is straightened-out.]
I guess I will add right here, as a kind of special note, that since * at least * version 9.10, Ubuntu’s Installer program (which you can activate from the live-cd) seems to have been much improved.  If you happen to  get to reading an old thread on the internet, and people are talkin’ about how they had to partition their harddrives themselves, and what a pain it was, and how they then had to copy the system files to the harddisk, &tc &tc, be sure and check the * dates * of these posts, as to * when * the authors posted them.  AND BE AWARE NOW, that Ubuntu’s Installer nowadays does all this * for * you (assuming it’s working right on your hardware), if you choose to install it to your system’s internal harddrive, “side-by-side” with WINDOWS.  (Unless perhaps you choose “custom install” from the menu for some reason).  So you probly won’t have to run G-Parted yourself, or any of that crap.
It’s IMPORTANT, though, to give any distro you’re serious about a thorough testing as a live cd, DVD, or USB thumb-key.  This way, it doesn’t touch your harddrive (unless you tell it to),  and you get a realistic idea of how it would perform on your hardware, should you decide to go ahead and install it.
I have installed Ubuntu 9.10 to the harddrive of two machines, and both went-off without a hitch, and each install took about 90 minutes of my time.  (I currently use Ubuntu 10.04 LTS, BTW.  I’ve installed to some other machines, too—and many partitions on the equipment I do have—but it is not enough to make me an “expert”.  Like I said in the opening paragraphs, I only write all this because I like Linux, and in the Hope it may help others.)
Be prepared, though, if you are intent upon installing any Linux distro to a machine’s internal harddrive:  read the manuals, and the rest of what I (and others) have to say here, and make sure your battery is full if you’re gonna try this on a laptop.  And be sensible, as to when you plan to undertake it, if you are gonna mess with the harddrive-method:  don’t plan it for that morning when you know it’s gonna be your turn to pick-up the kids from soccer practice.  Because it might take you longer than 90 minutes, Dude.  And for goddsakes be sure you have backed-up all your data first—like to removable media such as cds—and defrag WINDOWS, then probably open Windows Update and download and apply any updates that it can find.  And then create a restore-point.  These are easy things to do—especially in WINDOWS 7 or WINDOWS 7 Starter-build.  Especially if you are gonna install a Linux to the machine’s INTERNAL HARDDRIVE.  [The equivalents of these procedures, * where applicable *, are just as easy in most modern Linux, BTW.  You don’t have to defrag in Linux, for instance, because Linux has a better file-allocation and write-to-disk modus. ]
[ I don’t wanna sound like your naggin’ mum, but, have you gone to Windows’ Backup and Restore—like within a day (or maybe a few minutes) of getting your Windows computer set-up—like * BEFORE * you went online with it, and created all your backup disks—like your backup boot-disc set that Windows * used * to come-with from the vendor, and nowadays often doesn’t anymore?  Or your Windows 7 repair disc(s) (which is a little different again, but also recommended by Microsoft); or your Administrator password reset disc?  And/or have you used Backup Center since then—like every Sunday night before bed—like before shutting-down the computer for the night?  You do shutdown your WINDOWS computer for the night, at least every once in a while, don’t you—like so that recently published updates can be applied?  And what about those WINDOWS 7 or Vista updates that are waiting to be added?  If you do not know what these things are, or at least have a rough idea of how to perform them, then you should Google them, and, once you have done them, return here, and pick-up where we left-off.]
IMPORTANT NOTE:  If you install desktop Linux to your computer’s internal harddrive, or you otherwise alter the size of WINDOWS’ “living-space” on the harddrive (they call such space a “partition”), then after you’ve finished the install, and you perhaps let the machine boot Ubuntu (or whatever Linux) for the first time, to see how it did, THEN YOU’D BE WISE TO SHUTDOWN, AND WAIT A MINUTE, AND BOOT-BACK-UP, * THIS * TIME * SELECTING * WINDOWS.  Because WINDOWS’ own boot stuff keeps track of its own partition size, and will stubbornly refuse to boot if the partition size has been changed ** twice ** between boots.  So any time WINDOWS’ partition has been altered, WINDOWS should be booted soon.  It does not have to be right-away.  But WINDOWS should definitely be booted before any ** other ** changes are made to the disk, just to be safe.  It’s okay to boot the os you just installed, to see what nice work you did.
BUT DO NOT ** FORGET ** TO BOOT WINDOWS, ** BEFORE ** YOU MONKEY-AROUND WITH THE DISK-CONFIGURATION AGAIN.  Just to be on the safe side.  This is actually a kind of DRM-ing Microsoft adds to their product, to make it harder to pirate (that means to illegally copy, without paying for it).  Linux keeps track of its size on a disk, too:  but it is not programmed to refuse booting if the partition size has been changed more than once.  Why wouldn’t Linux do that too?  Linux is FREE (as in * free speech *, anyhow), so you can’t “pirate” it (well, not most distros, anyway—unless you copy it to cds and sell them for money, or else make your own version of, say, Ubuntu, but still call it Ubuntu).  Read a little about the basics of the GPL/General Public License that Ubuntu and most Linux distros use, which is an alternative to “traditional” copyright and/or intellectual property protections:  Google it.
But ms WINDOWS is not free, so a rule of thumb is to always remember to boot it, after working on a harddrive, just to make sure it is okay.  As I may have already indicated, some computers just have considerable difficulty with this “traditional”, side-by-side, “dual-boot” arrangement, and for reasons that are just difficult to nail-down.  If your WINDOWS balks seriously at having another operating system on the same disk, you might be forced to restore the disk, and dump Linux.  This is comparatively rare.  But it does happen.  So there is a risk one takes in doing the traditional dual-boot arrangement, as a means of getting a “full install” of desktop Linux.  That’s another reason why you should’ve made your WINDOWS os recovery-and-restore disks in WINDOWS Backup and Restore Center [Start > Control Panel > Backup and Restore].  This should always be done—any time you get a computer for your frequent use.  More about this in a bit.
What these last several paragraphs (pertaining to internal harddrive installs of Linux) add-up-to, is that I think I’d start-out in messin’ with Linux by usin’ USB thumb-key(s), if I wuz you.  This method of getting Linux is becoming increasingly popular anyway, and for several reasons I will touch-on in here.  It is good advice for us noobs—and maybe even for more experienced WINDOWS people.  Another worthy option, as I have already said, is an external, add-on harddrive, or a micro-drive (mini-external-harddrive).  Or perhaps Virtual Machine.
Let me re-iterate here, that NOOB and NEWBIE ARE NOT terms of insult, as some people might infer:  A NEWBIE just means “someone who is NEW to the experience”—whatever that experience might be.
In the coming years, perhaps migration from WINDOWS to Linux will be made even smoother.  Programmers and coders are working on this—at least sporadically, as time permits.  But for now, it is like it is.  If you think this is inconvenient now, you should look online, and see what people had to go through to use Linux back in the SLS-era.  [Installing from multiple floppies back then could take a really long time—whether the os being installed was WINDOWS 3.0 or, say, one of the (relatively) early incarnations of Slackware Linux.]  Linux for the ordinary user has come a long way.
I am aware, however, that knowing all this is perhaps scant comfort to a normal  user who wants to migrate to Linux.  It is to this end that I have tried to elucidate and otherwise mention every comfort, convenience, and workaround that I can think-of and find, and to try to present these in an understandable form.
In whatever case, once you’ve done an install (whether to a USB thumb-key or other media), you may have to do some “post-install tweaking” to get Linux working right.  Personal computing is complex enough, anyhow, and therefore most people will want to make some adjustments to pretty much any system they buy, instead of just continuing to use it like it came.  [For more on USB thumb-key installation, see Appendix A, c. pp 212.]
It is also true that there are computers on the market that come with Linux—towers and laptops:  but these probably come ** ONLY ** with Linux.  And so if you buy one of these, you would not have WINDOWS as a “fall-back” position (unless you were to install WINDOWS from a disc, or have somebody do that).  It is also often true that in order to install WINDOWS, you have to wipe-out any other os that might be installed to the disk, because WINDOWS is built with the assumption that it is the only pc operating system in existence, and there “simply aren’t any others”.  So the person doing this may well have to do a backup of Linux (easy), and then install WINDOWS (sometimes moderately easy, sometimes very hard), and then re-install Linux (usually pretty easy).  And frankly, I tend to think any of us will probably find ourselves in need of the WINDOWS backstop at some point.  If not for occasional use as we get used to Linux, then perhaps for no other reason than that  we are in a strange town, and we need to print-out something from our laptop at the public library.  The librarian is helpful, but her printers only support WINDOWS, and maybe also Apple/MAC.  Then you think:  “Hey, I could run that document through my WINDOWS partition, using Samba”.  Or you could just use a thumb-key (a.k.a. a flash-drive).  I will add that there are also other ways to deal with this type of issue, and without the resort to Microsoft WINDOWS; but it is just easier for those of us new to Linux to have a WINDOWS install to lean-on, until we get our “sea-legs”.
If you do decide to purchase a computer with Linux already installed (like maybe a System-76, for example), I counsel that you research it some first.  Get some reviews online (or from another source as well), and try to get them from some users who have actually had the thing for awhile—not somebody who just got it home from the store.  Consumers mentally associate Microsoft Windows with the physical machine itself, pretty much whatever brand-marque is stamped on the case, and vendors (that is to say manufacturers and retailers) are fully aware of this fact.  But if you get right down to it, it’s * NOT * analogous to buying a new automobile.  If I go to a showroom and look at a new Nissan, I tend to assume that the engine was also built by Nissan.  This might just be because I’m old.  (I can remember a friend telling me back in the * 1980s * that his new Dodge minivan came with a * Mitsubishi * engine.  And that was quite a while ago, wasn’t it.)  In the PC-World, however, the analogy is even poorer.  I’m a long way from being an attorney, but I tend to think that if you examine EULAs and legal agreements, you’ll find that Hewlett-Packard or whoever’s name is on the machine absolves themselves of responsibility as to whether * any * operating system will properly run on it at all.  Even Microsoft WINDOWS.  The reasons for this are probly fairly obvious, if you think about it a little.
So if you get a computer with Linux pre-installed, I’m not so sure you ought to expect it to work any better than if you were to just start playing with free Linux, like the distros you can download and burn as a live-cd yourself.  I would suppose you’d be better-off if you didn’t let some company or computer-shop from whom you bought a Linux pc know that you are aware of this fact, though—because you may still be able to make them take care of you if it somehow goes awry through no fault of your own.  But that goes without saying.
There are other pros and cons associated with installing Linux yourself, some of which I will get into further in this document, or in other documents I have posted.  But really, I’d have to say that many of the headaches can nowadays just be avoided by installing to a USB thumb, and running your Linux from that.  There are other ways too, like the Virtual Machine method, or using an add-on, external harddrive.  More about this as follows.
Some people call Linux “obscure”, but it really is not.  Google’s Chrome OS and Android, after all, are based on Linux!  There are operating systems out there which really * are * “obscure”, like MorphOS, or ReactOS.  Or Haiku.  I do not mean to disparage anybody’s favorite os.  I just want to illustrate a point.  (Read some of the comments I get, though, as to “obscurity”.)
Until fairly recently, Linux was more of a big thing in the server-market, as opposed to desktops (desktop, generally, = “end-user, point-and-click consumer system”).  It still is.  Arguably just over ½ of all ** server ** computers * in * the * world * are Linux-boxes.  And Google famously runs its server-farms on ** Linux **!  But to-day Linux is also becoming a competitive desktop.
If you’re starting to think it’s too involved, let me suggest this:  why don’t you try re-installing WINDOWS a few times, like on computers that had to be re-formatted, because they got a bad virus/trojan/worm?  Yeah, sometimes it goes real smooth.  Sometimes it does not.  Sometimes WINDOWS’ networking isn’t working after you get it re-installed, and for no explicable reason.  Or it won’t boot, and you finally have to start over.
Linux, on the other hand, is immune to the overwhelming majority of viruses in circulation, at the time of this writing (Autumn of 2011).  What can happen is a root-kit, or certain other Linux malwares:  but this is mostly targeted at ** servers **.  And there are ways to make your Linux secure (relatively speaking, of course) against even these.  Linux can be set-up to be more secure than WINDOWS, and this is just pretty much an immutable fact.  It is also true that WINDOWS root-kits are even more prolific, and arguably harder to deal-with—because the only effective means to deal with one is a complete re-format and re-install.  Reformatting & re-installing is actually easier to accomplish in Linux, ESPECIALLY ONCE YOU HAVE HAD THE FIRST INSTALL WORKING.
Or you can just NOT INSTALL Linux.  Just about every Linux distro to-day can be run just fine from a live-cd (or a bootable USB-key, unless your computer is old enough to have a BIOS that cannot boot from USB:  see “universal USB installer” at the site pendrivelinux.com), without ever installing it to a harddrive.  WINDOWS can’t run this way [or it’s * bloody * difficult * to * do *—unless you use a VM-arrangement:  and anyway such “live-booting” of ms WINDOWS probably violates Microsoft’s EULA (End User License Agreement)].  And when a Linux file-system is burned to a cd (or DVD) as a closed session, the file-system can no longer be physically altered.  So a live-cd’s file-system CANNOT ITSELF BE HACKED.  This statement is also probly as true for Linux booted as a live-filesystem off a harddrive partition, instead of a cd or DVD.  What might be possible respective of live-Linux is that somebody figures-out how to hack the files that are “floating-around” in the RAM-disk.  But this would be hard to maintain, as most live distros “churn” from the cd-tray, calling-forth the system-files that are needed to accomplish the task-at-hand, and sending the virtual files-copies used for the previous task “into oblivion”.  What any live-Linux cd  runs in the RAM at any given moment is just a “mirror-copy” of the “real” system-files—these on the shiny surface of the cd.  And the surface of a cd-R, burned to a closed-session, cannot be changed.
The Slax distro will even let you use a program they make available on their server, which will construct a system for you to download, created to your specifications (as long as they are do-able), so that you can substitute more up-to-date versions of the usual bundled apps.  There is even a video tutorial on YouTube, but I’m not so sure Slax would be among the best for us beginners.  There’s a way to do it with Ubuntu, too.  You login to the server, take the time to make your customization selections, finalize your “order”, and it gets placed in the “queue”.  Their build-server then “builds” your desktop Linux operating system, and saves it in the form of an .iso file.  In a few days or a week, when your custom .iso is ready to download, you’ll be notified by e-mail.  Then you can download the result.  [Google it:  i.e. “make my customized Ubuntu on the web”, or “web services that create a customized Ubuntu for you”, or phrase it something like this.]
NOTE that any live Linux disc (cd or DVD) is gonna be  S – L – O – W – E – R  than when/and/if Linux is installed to and running from some other media, like an internal harddrive, or an external/add-on harddrive, or even a USB thumb-stick.  Not having enough hardware resources/“computing horsepower” can also cause Linux that is being tried from a live cd to be slow, or to “hang”.  This is just something you have to suss-out.  Check the “system requirements” for the Linux distro in question against the hardware (CPU model and clock-speed, amount of installed RAM) of your computer:  see the official website of the distro’s maker, for the hardware requirements.  Puppy Linux, however, is rather an exception to this, because Puppy is one of the comparatively few distros that is small enough to have ALL of the File System loaded into the computer’s RAM as the boot process finishes—and Puppy is pre-configured to do so.  It is because of this feature that Puppy Linux will run just as fast from a live-cd as from any other boot-media.  There are other distros that are like Puppy in this way, but they are comparatively few.  KNOPPIX also—though it does not load all of its FS into the RAM—is somewhat exceptional in this regard, as it normally runs as fast from a live-media, and the creators discourage harddrive installation anyway, because KNOPPIX is made to run from a disc or a thumb.
A VERY IMPORTANT NOTE/UPDATE TO TACK-ON HERE, IS THAT MOST USERS (by the time I’ve had a chance to insert this) ARE NOW PROBABLY SKIPPING THE * BURN * THE * CD * PART, BECAUSE WINDOWS-BASED PROGRAMS LIKE UNIVERSAL USB INSTALLER from pendrivelinux.com, et al (free!), ARE INCREASINGLY PERFECTED, AND SO YOU CAN TRY-OUT DESKTOP LINUX FROM A BOOTABLE USB KEY YOU MAKE IN ABOUT 15 MINUTES, AND IT WILL RUN AT VERY NEAR FULL-SPEED!  This is now a * better * way to test-run desktop Linux, and it will usually install just fine from the USB-drive, if you decide to do so.  Further, it usually won’t hurt the USB-thumb drive (“pen-drive/jump-drive”), and you could re-use it later, for other stuff, just by re-formatting it back to fat32.  One caution about this method, though, is that you do run (some) risk of writing the Linux distro to the wrong drive, if you’re  a) really inexperienced, or  b) really tired.  So be careful!  Or just stick to trying desktop Linux from a cd/DVD.
CDs and DVDs are still good, though, to make backup-copies of all your text, documents, batch, audio and video files (and your operating system(s) too), once you’ve installed the systems and programs you want to your disk.  Always make backup-copies of your Windows operating system on cds/DVDs as soon as you get a new (or used) computer, and then once again before you mess with Linux, BSD, or any other system.

[Screenshot:  Ubuntu 10.04 LTS in action.]

I am not a “steam-punk”, though I admire the creativity of this sub-culture.  But you know, PCs are a lot like the old steam locomotives that plied American railroads.  Laptops especially.    Having spent some of my life around railroads and trainmen—some old-timers who remember the last of the steam-era (1940s), and some my own age who worked weekends on “scenic railroads”—I have been told more than once that the steamers were said to “be alive”, to “have a personality”, and that “no two were alike”.
This is especially true of laptops and mobile-devices, which is how we most often compute to-day.  Laptops are more complex, as a general rule, than other hardware platforms.
And no less than the old steamers with their developed complexities and inherent flaws.  Look at a photograph of the inside of the cab of one of these steamers of say, circa 1920 or so—what the engineer saw and had to understand, as far as the controls.  The complexity developed by the steam locomotive technology of that era will * surprise * you.  And just like these “beasts” of old, laptops can be * cantankerous *.
My experience of Personal Computing, thusfar, is that “The Theory of Personal Computing is ** NOT ** the Practice”; and, “The Praxis does not always live-up to the * Supposed-Possibility *”.  Even in MS WINDOWS.
Chat-rooms and MMORPGs seem dominated by trolls, rather than by the participants.  [Though I find no “trolls” in Linux help chat-rooms, and very few in the forums; perhaps this is owed to the sheer difficulty that * operating * systems * pose to us mere mortals, as a general rule (?)].  Some office and productivity operations on even WINDOWS 7 seem to fail for inexplicable reasons, even more so if third-party software is involved—even though it may be reputable, and I have researched the !%@#^% out of it before downloading and installing.
Other things seem to “refuse to be accomplished”, even though Accepted Theory says that they should be easily executable.
To cite just one example, I went to the package-manager in LMF 9 (Linux Mint Fluxbox)  when I had it installed to a USB thumb and successfully running from there as a traditional hdd-type install, having used Mint’s own installer.  I installed Open Office Writer from the repos, and it said that the install was successful.  I had researched this beforehand, and was assured that I would be able to install just the documents-creator, and not have to get the entire office-suite.  But when I tried to open and use it, I got a message saying it was “not in the system”.  I had adequate space on the USB thumb, and could surf the web from it, and do other things.  Maybe this file got deleted from the 9-series repos.  I could have messed with it further, but due to day-job constraints and some other things, I sort of ran out of time.  A little later-on I over-formatted the thumb with another distro by mistake, probably because I had too many projects going at once—and I was short the money to go and buy a couple more thumb-keys, which is what I needed.
Yes, some of this is my own inexperience and ignorance.  But not all of it.
Just look at how “buggy” the software you use at work is, whether it is Windows or something else.

And then there is Linux itself.
Desktop-Linux, my friends, is not just a good free operating system for microcomputers; it is a kind of ** religion **.  Or at least a sub-culture.  Some might even use the word “cult”:   but that appellation has negative overtones.  To my mind, there are cults, and there are * cults *.   Aum Shinrikyo (which in the 1990s released some nerve-gas on a Tokyo subway train), the Heaven’s Gate suicide-pact we had here in the United States at about the same time, and the Jonestown tragedy in Guyana back in the late 1970s:  these meet my definition of * cults *.  Then there’s the other kind.  The more benign, useful kind.  Goth culture. Aficionados of the Rocky Horror Picture Show.  Doctor Who fans.  World of Warcraft.  Some even say Alcoholics Anonymous.
As far as I’m concerned, “Linux-religionism” (a.k.a. “Stallmanism”, after Richard Stallman) falls into the latter category, if it belongs in one of these categories at all.  Linux may be a “cult”, in the sense of a zealous following of hardcore and committed users that emerged after the fact; after the creators of GNU/Linux released their work, and it became public.  So I guess we should properly say that Linux is not a cult, but that it has a cult * following *.

SO IF LINUX IS SO GREAT, WHY ISN’T IT MORE POPULAR?
The best and most concise way to explain this is probably the following:
[Let me interject here, before the words that follow, that  1) many of the problems for which Linux is “notorious” (i.e. Flash-video functionality, “dependency-issues”, et al) have been fixed in recent years, particularly in the major distros (Ubuntu, Linux Mint, Fedora, KNOPPIX, PCLinuxOS, &tc.)  And these fixes seem to have been very effective, indeed.  Perfect?  NO.  Flash video will still be rough on * some * laptops where you’re using desktop Linux—though nowadays this seems comparatively rare.  MACs, by the way, can have similar difficulty—but then MAC is also UNIX-based, just like Linux.   2) Some of the other problem areas are more difficult to address (i.e. peripheral device compatibility, some printers, some BIOS boot-issues, good equivalents for certain of the less-used ms-Office suite programs).  I will say right here that I have yet to experience a situation where Ubuntu fails to recognize a USB mouse, USB keyboard, or multi-port hub; but I’ll add that Ubuntu 9.10 did not recognize my friend Jim’s * wireless * keyboard—though the wireless mouse was recognized.  I am * not * trying to be in the business of sowing F.U.D. (“Fear, Uncertainty, and Doubt”) about Linux:  I really like Linux, and I really like its concepts.  But I feel one of us ought to be more frank about the “facts on the ground”, so to speak.
A few of the drawbacks might be ascribed to Linux itself; and many of these are * “social” *, rather than “innate”—i.e. Linux programmers (who are often doing it for free—or, more to the point, for their own fame and sense of accomplishment—which is just fine):  well, these are not always disposed to creating some utility that is lacking, but which just “isn’t sexy”—say, a GUI front-end for some not-very-often-done operation, which most Linux “initiates” seem able to do from a command-line anyway.
Others of the “drawbacks” can be put squarely in the province of the hardware makers, who have little incentive to make their products “Linux-compatible”, Out-Of-The-Box.  Yet a surprising number of them do just that.  And there are often simple hacks and work-arounds available on the web, sometimes even at the major hardware vendors’ websites.  So maybe the hardware makers are trying to tell us something. (??) ]

Linux * is * a better system, in many technical ways.  But most of these would not matter to the average person, who just wants their computer (laptop, tablet) to work, when they want to do something with it.
Even so, I have to say that at this juncture, my thinking is that Linux will always be a little harder to use than the two main commercial platforms (WINDOWS and Apple/MAC).  Why?  ** Because it’s not bundled software ** (except in some comparatively rare circumstances).  Linux seldom comes as the default operating-system on any computer you buy, for ordinary home- and on-the-go use.  Yes, there are some.  But most machines come with WINDOWS (or MAC).  And why is this?  Probably because the other two are for-profit concerns, in the end-user market.  Linux, on the other hand, is only a for-profit concern in the server and industrial markets, and this only in certain circumstances.  (Yes there is ANDROID; but * that * I often feel has created its own category.  I guess you can flame me on it if you wish.  What am I, an IP law professor?  I’m lucky if my socks match, I was twelve before I figured-out camouflage isn’t a “real color”.  And the toes of both my feet are webbed.  If I am at a loss in understanding the machinations of all this software licensing stuff, maybe I’m not alone.)
Anyhow, all this goes back to the formation of UNIX, which is the older operating system upon which Linux is modeled, and which is still in use in the computer industry today.
UNIX grew out of MULTICS, a for-profit, commercially created operating system for big mainframe computers—machines which used to be the size of the building in which my apartment is located.
In the late 1960s/early 70s, there was only * one * American telephone company.  To-day, at the time I write this, such a state of affairs is inconceivable.  But it * was * the case, for most of the history of telephone communications in the United States, up to that point.  One corporation—the American Telephone and Telegraph Corporation (“AT&T”)—had been granted a legal monopoly in the telephone market of the United States.  This was agreed-to by the United States government, and the Federal Courts:  it was in view of the fact that telephones of the day operated only by means of a grid of * electric wires *, which spanned the nation.  Radio-type voice communications between ordinary citizens were confined to very specific sections of the radio spectrum, and were also tightly regulated, as cellular technology didn’t then exist.  But because * every * telephone call made during this era was accomplished on just * one * network (the telephone infrastructure of AT&T), a very great number of people then observed that the United States had the best telephone service in the entire world.
I come from that era (telling my age here), and yes, I tend to agree that the level of service we all received under the AT&T monopoly was remarkably reliable, and pretty sane.
What changed is that new technologies came along, which offered the possibility of increasingly greater customization, and new ways of using the existing network.  Deaf persons, and people challenged in other ways needed better access to the telephone system:  and that is one of the advantages which the new technologies offered, but which the telephone monopoly was supposed to have been slower in implementing.
In the early 1980s, the Federal Courts finally broke-up the monopoly (the divestiture took effect at the start of 1984), and American Telephone & Telegraph became just another player in the North American telephone market.
But cell phones were still a way off.  And it was during this era of the “hard-wired network” that computers became important to what was then “the telephone company”.  Because “the telephone company” had to switch thousands and millions of calls every day.  And the load expanded geometrically, with the passage of time.  What in 1946 would have been considered “state of the art” equipment—plug-boards and electro-mechanical switches—was becoming woefully inadequate by the 1960s.
And so, in or about the mid-1960s, AT&T entered into a consortium with MIT and General Electric, to develop a software-system to do what was then more complex tasking in a main-frame computer.  The operating system they began to develop became known as MULTICS (“Multiplexed Information and Computing Service”), because it was to be a multi-user, time-sharing system.  What that means, essentially, is that more than one user could be doing something on the mainframe computer at the same time.  Multiple user-accounts.  Time-sharing with separate users.  What we today would call “multi-tasking”.  This would help the phone company keep up with demand.
And it was a good idea, generally, because it would make large, main-frame computers more useful.  Yes, there * were * other software systems around at the time, which could have more than one user.  But AT&T and its partners wanted an improved system.
The project went on for some time, and made headway.  But for various reasons—perhaps known only to the then senior management at the phone company—AT&T gradually withdrew from the project.  MULTICS seems to have faded into ignominy.  Seeing what was about to happen, some young engineers at AT&T’s Bell Labs (later part of Lucent Technologies)—notably Ken Thompson and Dennis Ritchie—did not want all the work they had done to go to waste.  So they wrote a much smaller, simpler system of their own, and used it for their own purposes.  Like playing a computer game one of them had written for the system, to amuse themselves.  One of the wits in the department jokingly began to call their new creation “UNICS”—a pun on MULTICS (the “UNI-” poking a little fun at the bigger system’s “MULTI-”).  In any case, the name stuck, and soon people started to spell it “UNIX”.
UNIX made its way to various main-frames, and to university departments.  This was probly via tape-drives and other portable-media of the day, as what we would recognize as modern computer  networking did not yet exist.  UNIX would often be co-installed to a main-frame computer, as a kind of “after-hours operating system”, along with the official operating system the main-frame had.  The “geeks” who knew how to use the main-frame computer used UNIX for their own purposes, or to get work done.  Sometimes people would notice that UNIX was actually * superior * to the “official” operating system, that was loaded on the machine by the owners.
Then around 1991, a computer-science student in Finland named Linus Torvalds learned about UNIX (partly by way of MINIX, a small teaching version of it), and decided to write his own smaller version—small enough to run on his personal computer.  Because he saw how much better UNIX is structurally, compared to WINDOWS/DOS.
NOTE that you probably do not need to learn that much at all about the UNIX/Linux file system structure, in order to start using Linux:  all the files-manager programs to-day are graphical, so you don’t need the knowledge equivalent to some CS-major who’s writing a term-paper on the structure of UNIX.  A very rudimentary understanding (if that) is probably all that is necessary to-day, and it can come later.  The Linux desktop operating systems I will discuss here come with full suites of applications, already installed.  (With the exception that Puppy usually does not come with a full-blown office suite, exactly—though this is remedied easily enough.)  Most of the computer users I know are completely in the dark, as to the file-system structure of WINDOWS.  They just know how to download something, and then to run the installer-wizard with which it came, in order to install an app.  More about this later.
If you’re one who feels you just * have * to know about the filesystem, here’s the pretty basic drill:  [If you’re not curious about the file system, you could just skip-down about 6 paragraphs, to where it says “Anyway, Linus……”  in bold type.]
Microsoft WINDOWS is based upon the old CP/M operating system dreamed-up by Gary Kildall back in the 1970s, and modified/reverse-engineered by Tim Paterson around 1980.  Much like UNIX/Unics/Multics (the antecedent of Linux), CP/M / QDOS / MSDOS / Windows has an inverted-treelike filesystem structure:  however, in Microsoft/CP/M type systems, there is not one ultimate “root” of the upside-down “tree”:  instead, you have stuff like “C:\, D:\, E:\”, &tc.  There used to be A:\ and B:\, too—these were (usually) two floppy-disc drives in the front of the computer case.  If you’re thinkin’ “what’s a ** floppy ** disc * ??”, you can Google it.
In UNIX/Linux/BSD (and other UNIX-y / POSIX-y systems), we have Root directory—a kind of “folder of folders”—or, if like myself you hail from the first Gulf War era, the “Mother of all folders”:  just a huge starting-point on the filesystem disk, from which * everything * else proceeds.  Apple/MAC, by the way, is also a UNIX-y / POSIX-y system—at least since the release of OS/X.
Linux is really a from-scratch re-make of UNIX, which had been around in the computer science departments of universities since the 1970s.  A (then) student named Linus wrote his own small version of it—he did not copy UNIX’s actual code, he re-implemented the ideas—and this became Linus’s UNIX, or “Linux”.
Remember * Linux * is the * kernel * which your favorite desktop Linux distro (Ubuntu, KNOPPIX, or whatever else) uses to run on-top-of.  Just like WINDOWS is a “software stack” that runs on-top-of the NT kernel. (Back in the old days, WINDOWS used the DOS kernel).
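If you want to see that layering for yourself, here’s a tiny illustrative sketch (assuming Ubuntu or something similar, typed into a Terminal window):  the kernel reports its own version, quite separately from whatever Desktop Environment happens to be painted on top of it.

    uname -r    # prints just the Linux kernel version number
    uname -a    # the long form:  kernel name, version, machine architecture, &tc.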
In Linux, we use the / (forward slash) to separate the levels (branches) of the folder-tree.  In the Windows/DOS world, the backslash \ is used.
“Root” in Linux is represented as just a forward slash (  /  ), all by itself.  Every path on the system starts there.  Engineers knew there would always be exactly one Root, obviously.  So it was decided not to waste a special name on it:  a bare  /  at the front of a path simply means “start at the very top of the tree, and work your way down”.
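Purely as an illustration (the user and file names here are made-up), here is where the very same sort of document would live on the two kinds of system:

    C:\Users\Pat\Documents\letter.txt       <-- WINDOWS:  starts from a drive-letter, uses backslashes
    /home/pat/Documents/letter.txt          <-- Linux:  starts from Root ( / ), uses forward slashes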
Root in Linux is the equivalent of an Administrator account on a Windows system, more-or-less.  More about this in the numbered entries below (though I try to make it very easy for us to understand).
Most of the major desktop Linux distros default to making you run the day-to-day operations of the system as ordinary User, occasionally invoking the system privileges that the Root account holder would normally only hold, in order to install a software package.  Or perhaps to change the partitioning layout of your harddrive, or a peripheral disk, such as a USB thumb-drive.  By contrast, many people run their WINDOWS 7 computer as “Administrator”—which is the WINDOWS term for the Root account on the system.  People do it this way because they remember that in WINDOWS XP, many MMORPG games and certain other games or apps wouldn’t work, if they logged-in on an ordinary user account.  Well, Microsoft has fixed most of these issues in WINDOWS 7, so running the computer as Root/Admin is a bad practice.  It makes the system much more vulnerable to viruses and spyware.  Even so, certain games and apps still won’t work from a User account in Windows—not even in Windows 7.
Ordinary user accounts on a Linux system can (temporarily) invoke Root privileges without switching users, by using text commands like sudo (“Super User DO ~ something, some task”), or su (often glossed “Super User”, though strictly it means “switch/substitute user”).  You have to type stuff like this as lower-case, unless told otherwise.  MS DOS doesn’t care if it’s upper or lower case, or mixed, when you’re in a “DOS-window” (DOS-emulator).  Which almost no Windows user ever uses anymore.  UNIX (and therefore Linux) separates these two alphabets, and therefore twice as many commands are possible with the same number of letters.  So we say that Linux commands are case-sensitive:  it matters to the Linux system whether you typed a command as lower-case, or a capital letter, if you’re trying to use a non-GUI command-window in Linux.  Where * searching * for files is concerned, though, this can be gotten-around:  the find command, for instance, has an “-iname” option that ignores upper/lower case, and case-sensitivity in the search boxes of the various GUIs/Desktop Environments is usually turned-off, by default.
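Here’s a minimal sketch of what all this looks like in practice, in a Terminal window on Ubuntu or something similar (the package name is only an example):

    sudo apt-get install vlc    # do ONE task (installing the VLC player) with Root privileges;
                                # it asks for * your * password, not some separate Root password
    ls                          # lower-case “ls” lists the files in the current folder
    LS                          # upper-case “LS” is a * different * name, and normally just fails
                                # with “command not found”—that’s case-sensitivity
    find . -iname "*.jpg"       # the -iname option of the “find” command ignores case, so this
                                # matches photo.jpg, PHOTO.JPG, Photo.Jpg, &tc.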
I think sudo is usually pronounced “sue – dew”.  But I’m a rube (sort of like a Bogan to you Aussies), so I call it “sue – dough”.
The sudo command is installed to Ubuntu and most of its variants/remixes by default.  Where it comes to other types of desktop distro, it may also be present by default.  Or one may have to install/configure it, but this is not that hard.
Further, when we graphically install a software package from Ubuntu’s Software Center, and we type our password into the authentication manager, we are in effect using sudo, but we are doing it graphically.  So really, by this time in desktop Linux’s development, all this sudo stuff is almost academic.  I have not had to open the Terminal and use the sudo utility on Ubuntu 10.04 since I installed it.  I just do some things this way once in a while, because I like to see how things work.  And it does seem to still be true that if you have downloaded some other Linux distro with a Linux desktop, there is still no graphical way, out-of-the-box, to confirm the hash calculation (the “checksum” that proves your download isn’t corrupted), so one has to do so from a command-window (“Terminal-emulator”).  Whereas there are a number of free programs to do this in Windows XP and 7.  In Ubuntu, to test the hash-integrity of some new Linux distro I’ve downloaded, so I can burn it to a cd (or make a bootable USB thumb-key)—so I can try it out, and see if I like it—I find I still have to open a Terminal window, and navigate to the Downloads directory, and type “md5sum”, a space, and then the exact name of the .iso file, and hit Enter.  But this is not so much harder than doing it graphically in Windows—like with HashCalc 2.02, or any number of other such apps.
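For what it’s worth, here is roughly what that little Terminal session looks like (the .iso file-name below is only an example—use the exact name of whatever you actually downloaded, and compare the result against the checksum published on the distro’s download page):

    cd ~/Downloads
    md5sum ubuntu-10.04-desktop-i386.iso
    # after a minute or so, it prints a long string of letters and digits,
    # followed by the file-name; if that string matches the one on the website,
    # your download is good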

HERE’S A BIT OF A RUN-DOWN OF LINUX’S FOLDER-TREE / DIRECTORY-TREE, if you’re interested:

Directory    Description
/bin    Essential command binaries
/boot    Static files of the boot loader
/dev    Device files
/etc    Host-specific system configuration.  Back when the UNIX file system was being created, the engineers didn’t want to spell-out “miscellaneous” or even “misc” every time, so they named the catch-all directory “etc”, for “stuff that doesn’t seem to fit the categories of any of the other system directories, ‘etcetera’”.  Unless you’re using some distro like maybe MoonOS, the /etc directory is essentially an inhuman mess, full of all kinds of stuff, often in no intuitive order.  However, this is (in a way) part of Linux’s charm:  you might have to be an expert on this directory if you wanted to write * certain * kinds * of malware for the Linux platform.  And /etc, being arcane, is therefore (at least somewhat) an impediment to those who would malware and/or virus Linux, making Linux more secure.  [Well, maybe not nowadays:  by now, there are other ways to malware a system, because of the ways in which the web and cloud-computing have developed in the late 2000s, and the different ways we as users use the internet now.  But in historical terms (if not also contemporary), I’d tend to say that the arcana of /etc has been one (passive) way that Linux has defended itself.]
/lib    Essential shared libraries and kernel modules.  A lot of your Linux’s “dependencies” (rough equivalent of Microsoft’s “dll” files) would be in /lib.  These would be libraries (libraries = “lib”, get it?) of stored routines which various applications (“apps”, like Open Office and VLC media player, et al) share, and use to do their work.
/media        Mount points for removable media, like your little collection of USB thumb-drives.
/mnt    Mount point for mounting a filesystem temporarily.  This can mount the * software * (files, i.e. “file system”) that resides in a USB thumb-drive, after the actual * hardware * (that rectangular plug on the end of the USB drive) is mounted and recognized (see above entry).
/opt    Add-on application software packages.  This directory may not be present on some systems—especially very spare, lightweight distros.  /opt/ is where (usually third-party and/or binary-only) additional (read: not part of the install/distro) utilities and applications go.  Examples would be office suites, binary-only browsers, etc.  This functionality is sometimes shared with /usr/local/, although the latter should really be used for locally-compiled apps, according to many purists.  There are usually accompanying directories /etc/opt and /var/opt.
/sbin        Essential system binaries.  The “guts” of your distro, more-or-less.  The basic code that makes everything else work, I guess.
/srv    Data for services provided by this system
/tmp    Temporary files
/usr    Secondary hierarchy for user data; contains the majority of (multi-)user utilities and applications.
/var    Variable data  such as logs, spool files, and temporary e-mail files.
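If you’re curious what this tree looks like on a real, running system, a harmless way to peek at it from a Terminal (on pretty much any distro) is:

    ls /          # lists the top-level directories:  bin, boot, dev, etc, home, lib, and so on
    ls -l /home   # /home is where each ordinary User’s own personal folder lives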
Anyway, Linus made himself a personal UNIX system, which was a little easier to do by then, because of advancements in computer science, generally.  He shared this with some geek-friends of his, because it didn’t cost him anything (except his labor, which he did not care if he re-couped).  His friends began calling this “Linus’ UNIX”.  And before long the two words became run-together, and Linus’ UNIX became known as “Linux”.  (Linus + UNIX = Linux).  [No, this description of events is not exact.  It is my own synopsis, in order to avoid making this longer than it already is.  But it is * conceptually * correct.  You can write your own, if you wish. ]
Meanwhile at MIT, in Massachusetts, Richard Stallman’s GNU-Project had been hard at work trying to produce a similar type of “open” operating system for personal computers, and for some time.  But they started at the other end, for some technical reasons.  Instead of starting with the kernel of the system, they decided to leave this hardest part for later, and planned to do it in a different way.  Linus beat them to the punch with the Linux kernel, and, as he had publicly posted it, people started using it.
But this new Linux needed apps, in order to actually do a wider variety of stuff.  So people began to combine the apps from the GNU-project with the kernel of Linux, which are both POSIX-compliant paradigms.  That is why it is properly called ** GNU/Linux **.  Because Linux would not exist without the GNU Project.  But most people seem to just continue to call it Linux.
I don’t know much as to exactly how accurate this rundown is.  It’s probably the most crude rendition of what really happened.  If you care that much about Linux’s origins, you can read the flaming I will probly receive in the comments section (below).  If, like me, you just want a free operating system to just * use *, and which is actually a good system (for a change), read-on.
So Linux (and its cousins the BSD-flavors) have been, for most of their time on this earth, freebies.  (The BSD-systems are a more complicated history, and beyond the scope of this article.)  Free to obtain, and free to use.  Free to copy and give to friends and others.  And—strictly speaking—the GPL even permits selling copies; it’s the * trademarks * (names and logos like “Ubuntu”) that you can’t just re-use (if you want to know more about the fine print, read the GNU General Public License (GPL), or learn about the various software license-types, or perhaps consult an intellectual property lawyer.)  And this makes Linux different from the commercial platforms.  The major computer hardware makers (HP, Dell, Asus, Lenovo, &tc.) would probably love to switch to Linux.  They could even create their own versions, and add some more user-friendly tweaks.  All-the-while leaving their systems open enough so that Linux-UNIX softwares written by third-parties (the community-at-large) could still be installed to your computer by you, the owner, without a huge hassle.
I guess there are two (at least) problems with this idea.  1) It would require the big computer makers (“hardware vendors”) to co-operate, to some extent.  They probably already * do * *co-operate *—to some extent.  And more than we are generally aware.  But they have to watch-out for American anti-trust laws, which can be vague in some areas—so it may be hard to tell if some action of co-operating with your competitor is going to be okay or not, before the fact.  And they’re in business to * compete * with the competition—not to co-operate.  2) They are dependent on Intel (for the most-part) to make the CPU-chips.  Intel dominates the CPU market, because their technology is ahead of everybody else’s.  And they make sure it * stays * ahead.  And Intel has a close relationship with Microsoft.  More to the point, the vendors of various-and-sundry circuit-cards and “peripheral devices”—pieces of computing hardware that everybody’s system needs in order to get things done—well, these companies have a mixed track-record where it comes to co-operating with the programmers (“developers”) at Linux, so that they can make sure that the drivers for these devices will * work *.  And so the status-quo is difficult to change, once it is in-place.  Yet I will hasten to add that a surprising number of printer manufacturers and web-cam producers, and the makers of other peripheral hardware ** do ** co-operate with the Linux community, and help produce Linux drivers that work.  So maybe the major hardware-makers are trying  * to * tell * us * something *.
Thus we see that a very prominent obstacle to Linux’s adoption—perhaps the greatest obstacle—is just that Linux is ** not bundled-software **.
The WINDOWS interface is part of the status-quo too, isn’t it?  And the general way WINDOWS works, which still differs in some respects from the way Linux handles certain things, despite Canonical, Ltd.’s best efforts to make a Linux distro that’s more amenable.  Indeed, the underlying way Linux works does not need changing, because it * is * better, generally.  But in superficial ways, I still have to think desktop Linux could benefit some more by making the “feel” more comfortable for us Lamers, coming from a WINDOWS environment.
Technology can be daunting enough, if you’re a non-Geek.  So I’d really like desktop Linux to dumb-it-down just another notch or two—but mostly in a * SUPERFICIAL * way, I emphasize.  Leave 99% of the underlying structure alone.  Just make some better GUI-point-and-click utilities, and ones that really * work * reliably.  This * is * being done, even as I type this.  But it’s still somewhat of a game of catch-up-ball.  Ubuntu is gaining, though.  The 11.04 and 11.10 releases have been a rough-patch, as the new Unity desktop-environment gets its legs.  WINDOWS had an even worse era than this, from roughly the initial release of WINDOWS 98 (which wasn’t so hot), through to WINDOWS Me (a disaster), and culminating in the initial releases of XP, which early-on had very few drivers and could be buggy as hell.
But all that was a long time ago—at least measured in “computer time”.  And it does not seem to be part of our cultural-memory.  That’s how used-to XP people got.
There is considerable Hope, however, that this situation (which I have probly made to sound worse than it really is) is likely to improve, in the coming months and years.  And this would be, if for no other reason, because Apple has abandoned its historic relationship with the “Power PC architecture” (which was Motorola-made), and gone to none-other than * Intel * for its CPUs.  And Apple’s OSX is really no more than a consumer-oriented re-hack of BSD—and BSD is a first-cousin to Linux.  All of these “*nix-type systems”—at least in their major incarnations—are POSIX-compliant:  Linux, the BSDs, Solaris/SunOS, Darwin, UNIX, and yes, MAC OSX.  And so because Apple has made the decision to port their operating system to Intel architecture, reason would seem to indicate that Intel CPUs would tend to be manufactured with * nix-compliance progressively increasing beyond what is already there.  I guess we’ll see if this turns out to have any positive effect or not.  Perhaps MAC is still sufficiently different that it will have no positive bearing on Linux-compatibility with Intel’s various components.
I need to add right here, that 1) a huge number of smart phones, tablet computers, and other small mobile devices DO NOT USE THE INTEL CPU CHIP—not even a Celeron—but utilize the ARM, which was/is Britain’s answer to Intel; and 2) * Microsoft *, no less, has announced recently that they are going to code the WINDOWS 8 desktop so it will run on * both * the Intel and ARM.  Of course there are people who * have * been running WINDOWS on ARM for a very long time.  But I’d tend to bet they’re not people who’re afraid to get their hands dirty.  ARM chips are RISC-oriented (Reduced Instruction Set Computing), and therefore consume less electricity.  And so not a few Linux developers are out there, working on ways to make Linux perform really well on ARM architecture.  ARM, really, is basically the equivalent of a * phone *-chip.  A very significant advantage Linux seems to have over WINDOWS, is that it seems to be much easier to port Linux to hardware platforms with low processing-power.  Even with the capability to use a GUI/point-and-click interface.  So I guess we’ll see whose system does better with ARM architecture.
Desktop Linux is just not that hard—at least not by the time of this writing, in late 2011.  Especially if you find the right information, which I have tried to make easier here.  A good book about Linux, and a geeky friend (or joining a LUG) are also marvelous facilitators.  Learn the UNIX/Linux vocabulary.  Google-around.  I will emphasize right here that you absolutely do not have to learn how to program, or to understand the difference between open-source software and proprietary software, in order to use desktop Linux.  That’s just more than an ordinary desktop user would need to know, in order to do most things with Linux.
I migrated to Linux on my new laptop (a Toshiba L515) all by myself, with no outside help, except that which I found available through my broadband connection.  I guess it could be worth noting that I was recently laid-off from my job (construction), and so I had a lot of disposable time.  I knew little about computers (or rather even less than I know now, which is infinitesimally small anyway), and I * did * use-up a lot of time.  I post the benefit of my experience here (however small it may be), in hope that it may help others, and that you will not have to duplicate all my research, or repeat my mistakes.  I now have Linux running on two other of my computers.  I will qualify this by saying that I have had to run it on my lappy with an acpi=off Grub file edit, so there is no battery management while I’m in Linux; however, this laptop almost never leaves my computer desk, so this is a “wash” for me.  When I get ‘round to it, I’ll try some more “granular” boot-params, and maybe I’ll find one that will let me run this lappy with battery management in Linux.  I might try giving the compile-a-custom-kernel option a go, as this has also been recommended to me.  As it is, I am mostly using my new netbook now—an AAO with a 10.01” screen, and I’ve never encountered the ACPI problem on this machine.
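For anybody curious what that kind of “Grub file edit” actually amounts to, here is a rough sketch (assuming a distro that uses GRUB 2, as Ubuntu has since 9.10—and bear in mind acpi=off is just the blunt parameter * I * happened to use; gentler ones, like acpi=noirq or noapic, exist, and your machine may not need any of them):

    sudo nano /etc/default/grub
    # find the line that reads roughly:
    #    GRUB_CMDLINE_LINUX_DEFAULT="quiet splash"
    # and add the boot parameter inside the quotes, e.g.:
    #    GRUB_CMDLINE_LINUX_DEFAULT="quiet splash acpi=off"
    sudo update-grub    # re-builds the boot menu, so the change takes effect at the next boot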
I’ll add here that, while I’ve been able to make the Ubuntu 10.04 harddrive install I did on my full-size lappy usable by means of a few tweaks, I have never compiled a custom kernel for it, which (apparently) is what this Toshiba lappy needs to run Linux without an occasional freeze-up—especially when opening some webpage with a lot of formatting.  There are a few relatively obscure codecs I should download and install, also.  That might help.  Based on information I found on the web, I am aware that these two things are probably the culprits of my remaining freeze-ups.  I may get around to them this Winter, but I probably won’t; my real (Linux) project is my AAO netbook.  Winter will be here soon enough, and the rest of my workload will dissipate, along with my day-job.  Until then, I’ll probably continue to use the Ubuntu partition like it is, remembering to click File > Save in Open Office Writer 3.2 after every paragraph, which has been very effective in helping me avoid any data-loss.  This not only saves the new copy to the file/document itself, but also to my Dropbox account.  This is a healthy practice anyway, no matter what platform you’re on—WINDOWS—even Windows 7, MAC, Linux, etc.:  you can still lose a whole morning’s work by not saving frequently.
Deleting the side-bar comments from .doc s into which I have pasted formatted stuff in OO Writer also speeds things up a great deal, and I’ve never had occasion to use the “comments” for anything anyway.  Just horizontally scroll to any one of the comments, then click the little drop-down arrow it has with it, and click “delete all comments”.  (Better yet, I’ve found it a great improvement to use any of the various screen-shot tools available in Linux—GNOME screenshot tool, or the distro-independent tool “Take Screenshot”, or XFCE’s “screenshooter”, et al.  This turns the hyper-link image I was about to paste into my document into a .png or .jpg (or other format of your choosing), and I can then use the Nautilus file-manager to copy and paste the .png, .jpg, or whatever into the document.  This makes the document much easier to open, to scroll-around in, and it will still display the image even if the computer that gets the file isn’t connected to the internet.)
The install to my good lappy went off without a hitch (Ubuntu 9.10; the machine came with Win 7).  As I have indicated elsewhere in this writing, I chose a dual-boot arrangement, which lets me boot the computer into either WINDOWS or Linux.  (It is wise to boot WINDOWS before doing this, and run the defrag-er.  When that gets finished—even if it takes half a day—then you are free to continue.  But I’ve never had the de-fragger take much time at all in Windows 7:  Windows 7 “keeps house” better than XP did, without attention from the operator/user—unless perhaps its defaults have been changed.)  Be aware also, by the way, that some machines have great difficulty hosting both systems, in which case you should probly seek a thumb-install, which I have written about elsewhere in this document, and in this database.  It is also true that a lot of people run Linux as a “virtual machine”, using virtualization software, such as VMWare, Virtual Box, and others.  I suppose I should add right here, that there are some services that are web-based/cloud-based, which will let you state the customizations and softwares you want to add to, say, Ubuntu, Slax, or perhaps Fedora, and then put your order into a “queue”, and then have the super-computer at their end put together a custom desktop Linux .iso for you to download, notifying you by e-mail when it’s done.  I don’t know how well these actually work, and I think they charge you money (except maybe for the Slax one).
Still, this could be an alternative to the way I’ve approached desktop Linux—which is to 1) install it to a partition on a harddrive, then 2) “cook” the distro—adding the settings, softwares, and customizations I want, making final changes if one of these doesn’t work out well with my hardware—then 3) use some program like Remastersys to create my own custom .iso of Ubuntu / whatever, and then 4) use, say, Unetbootin to create a “frugal/P.M.I.”-type install of my custom .iso to a FAT32 partition I’ve made on my harddrive, so that my customized system can be booted as a “live system image”, with any (few) additional changes being saved to a persistent-save partition (which is easy to make with GParted:  just make an ext2 or ext3 partition of, say, 2-4 GB or whatever size, then run “e2label /dev/<partition number> casper-rw”).  These persistent-save partitions are often called “persistent-save folders”; but this is perhaps a bit misleading:  yes, you * can * do this “persistent-saving” to a * file * as well—at least that’s the impression I’m under at the time of this writing; but in reality, most forums and such seem to recommend the * partition * method.  Just understand that we cannot, it seems, put very much in the persistent-save thing:  that’s just how it is, at the time of this writing—the more this “persistence” is utilized, the  s – l – o – w – e – r   Ubuntu seems to get.  Which is not what we want.  And so this is the reason for re-mastering the distro first:  re-mastering will roll most added softwares and changes into the new .iso image, and so not make the resulting system-image   s – l – o – w – e – r   to perform.  [See Appendix A of this document]
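For the record, the same persistent-save partition can be made from a Terminal instead of GParted.  A minimal sketch (the device name /dev/sdb2 below is only a placeholder for whichever spare partition * you * created; triple-check it first, because mkfs will wipe whatever is on it):

sudo fdisk -l                        # figure out which device name your spare partition actually has
sudo mkfs.ext3 /dev/sdb2             # format it as ext3 (ext2 works too)
sudo e2label /dev/sdb2 casper-rw     # "casper-rw" is the label Ubuntu's live-boot machinery looks for

Then boot the live image with its “persistent” option, and it should find the partition by that label.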
Why go to this much trouble?  Why not just install Ubuntu to the harddrive, using the defaults in its graphical installer (the Ubiquity program; Anaconda in Fedora and many “RPM” distros)?  Isn’t this easier?  Yeah, I guess so.  And it’d have enough of the benefits available in desktop Linux that the average user would be satisfied, I’d think.  But I often find desktop Linux is at its best as a “live system-image”:  and what I’ve just described is a way to get the benefits of a “live system” without having to boot it from a cd/DVD (runs slower), or a USB thumb-key.  [I don’t like to have to fumble with these, when it comes to booting an * operating * system *—not for daily use:  the male end of the thumb-stick can become bent or corroded, it can get lost, or become corrupted if somebody’s toddler at an airport café suddenly decides to jerk the thumb-key out of my laptop, and put it in his mouth.
Or I accidentally bump it against the table’s center-piece, because that’s built into the table at the restaurant or coffee-house where I happen to be, and there isn’t optimal room for my lappy.  Thumbs are fine for backing-up files, and the various GUI “bootable USB-creator” programs have become increasingly good at getting them to reliably boot many different desktop Linux distros.  But I still prefer the “internal thumb-key method” (a FAT32 partition on my harddrive) over an external USB-key drive.]  AN IMPORTANT NOTE, perhaps, for those who don’t wanna touch their harddrive (and perhaps especially their MBR):  there is nothing stopping you from just installing the distro to a USB thumb-key, ** as ** a ** full-install **, using the “Other” or “Do Something Else” option in Ubuntu’s Ubiquity installer:  just make sure that the thumb-key is big enough (probly should be at least 8 GB), and that you are ** careful ** to ** point ** the ** install ** at ** the ** USB-thumb **—NOT your C:\ drive.
This renders a bootable Ubuntu thumb which will let you add programs and settings the same way as if it were running from a harddrive partition; you can keep testing how these alterations perform on your individual hardware, and, when satisfied, install Remastersys or some-such, and * then * create your custom .iso image.  Myself, I’d then prefer to copy this custom .iso to another, * ordinary * USB thumb-key, from where it is easily accessible to a variety of (free!) GUI methods for making it bootable from one of my harddrive’s FAT32 partitions.  Or I guess you could just make it into a bootable custom thumb-key, and configure persistence for it later.  It’s up to you.  These USB thumbs are inexpensive enough nowadays that you don’t have to touch your harddrive or MBR * if * you * really * don’t * want * to * do * so *.  More on this subject later, and in other posts.
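In case it helps to see the shape of the thing:  Remastersys itself is basically a one-command affair once installed, and a related trick for booting the resulting .iso off a harddrive partition is GRUB 2’s “loopback” boot, a cousin of the frugal-install idea I use.  A rough, hedged sketch; the partition number (hd0,3), the folder /isos, the file-name custom.iso, and the initrd name are all placeholders you would have to adjust for your own setup:

sudo remastersys dist            # builds a distributable .iso of the installed system (no personal data)
# (the finished .iso lands in Remastersys' working folder -- see its own docs for the exact path)

# Then, to boot that .iso straight from a harddrive partition, add a GRUB 2 entry
# to /etc/grub.d/40_custom along these lines, and run "sudo update-grub":
menuentry "My custom live Ubuntu" {
    set isofile="/isos/custom.iso"
    loopback loop (hd0,3)$isofile
    linux (loop)/casper/vmlinuz boot=casper iso-scan/filename=$isofile quiet splash
    initrd (loop)/casper/initrd.lz
}

Whether you go the Unetbootin “frugal” route or the GRUB loopback route, the idea is the same:  the read-only .iso does the booting, and any persistence lives in the casper-rw partition described above.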
Having double-clicked on the Ubuntu “install” button, I answered questions when it put them to me.  The whole process took about ninety minutes, and most of this was no more than Ubuntu formatting and configuring its harddrive partition—which it doesn’t make as large for itself as the WINDOWS partition, by default.  Linux doesn’t need as much space.  As I’ve said, one can also use the install-to-thumb-key method, because I have learned how to do that now.  And it has gotten easier (and perhaps better), because free programs for the purpose have been improved—even in the last twelve months.  Universal USB Installer 1.8.6.3 from PendriveLinux.com, which you can download to and run from WINDOWS, is a good representative of such a program.  It’s dang near idiot-proof.
[The BIOS of some machines (especially older ones) does not have the option to boot from a USB port.  So an older BIOS sometimes cannot boot a bootable USB thumb-key.  This can be overcome by learning to make a special type of “boot-helper cd”.  You could look at the file “l boot from usb when bios is not able”, or Google the issue if it crops up for you.  PLoP boot-manager (I think it’s free of charge) seems to be able to let you do this.  Sometimes this lack of USB bootability can be remedied by just updating the BIOS firmware itself—sometimes referred to as “re-flashing the BIOS”.  Better let a professional tackle this one, unless you’re already WELL on your way to being a hardware tech professional.  If a BIOS re-flash procedure doesn’t come off right, the machine will probly not be able to boot ** anything ** again, until it is straightened out.]
I have since upgraded my full-size lappy, by the way, from Ubuntu 9.10 to Ubuntu 10.04.   And I now run several Linux distros on my 10” netbook, multi-booting with Windows 7 Starter (like I said, I’m a bit of an experimenter).
After using it (my Ubuntu 9.10 install to hdd) for awhile, I decided to install it to my Mom’s HP tower computer, because she had great difficulty remembering to look at the site-advisor before opening a web-page, and was in danger of getting malwared in WINDOWS.  I chose Karmic Koala (Ubuntu 9.10), because that’s the Linux I was using initially.  I even forgot, and didn’t download and burn the AMD64 (64-bit) version, even though I knew it was an AMD computer.  Ubuntu forgave me anyway, even though I used the i386 version by mistake (well, if you read their documentation closely, it basically says you should “try the i386 .iso first”—by which we mean * live *, before installing).  It is still running for her to-day.  The reason I don’t upgrade it to Lucid (like my lappy) is that the default interface in Lucid has smaller buttons, and mama has vision problems now.  Also, as Karmic no longer updates (‘cause it is no longer supported), she does not have to deal with Update Manager popping up.  I tried to teach her how to run it and install updates, during the last few weeks of Karmic’s support; but she just does not want to learn it, or much else beyond opening FireFox and surfing the web, for that matter.  So far, anyway.  Maybe as time goes on, she will allow me to show her a few other things.  Perhaps not, though:  my mother is quite elderly.  Nevertheless, she is able to surf the web, and to open a documents-writer and practice typing, which she likes to do.  And she has not had to call me for support in months.
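By the way, if you’re ever unsure whether to grab the i386 (32-bit) or AMD64 (64-bit) .iso for a given machine, a quick look from any running Linux (a live cd works fine) will usually settle it.  A minimal sketch:

uname -m                              # prints x86_64 if the running system is 64-bit, i686/i586 if 32-bit
grep -wo lm /proc/cpuinfo | sort -u   # if this prints "lm" (long mode), the CPU itself is 64-bit capable

The AMD64 build, despite the name, runs on 64-bit Intel chips too; and the i386 build runs on just about anything, which is why the documentation says to try it first.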
I have since installed various Linux distros to several USB keys, and I use Linux on an old notebook’s harddrive, as a “backup workstation”, when my good lappy is tied up with some experiment I’m trying.
It is also true that neither Ubuntu 9.10 nor Linux Mint 10 would successfully work on my neighbor Jim’s custom-built tower computer that he had made at a computer shop, circa 2005.  This machine has 1 GB RAM, and I think it’s a Pentium 4.  But maybe some custom mo-board.  I wanted him to try Linux, for the same reasons I installed it for my mother.  Maybe we’ll try it again, if he gets in the mood.  Jim is not in good health much of the time.  [UPDATE:  we did this, but HIS OTHER MACHINE (with a little more power) had difficulty running Karmic Koala right, and he lost patience before I could fix it.  I later discovered that the problem was probably mostly that his cd/optical tray-drives were not on the same standards as the ones in my computers, and so the cds I burned didn’t want to work right in his machines.  This phenomenon still happens sometimes—especially when trying to boot a Ubuntu cd burned on a newer computer from the cd-tray of an older computer.  I guess there may be a way to check this beforehand, but I’m not enough of a hardware tech to be able to say.  One can always just resort to downloading the Linux .iso on the target machine itself—from its WINDOWS install—and * then * burning the cd * on * the * target * machine’s * own cd-tray; then shut down and use the BIOS to boot the cd.  Or just try a Linux USB-thumb—assuming the old computer’s BIOS can boot from its USB connection.  Even if it can’t, we can try the PLoP boot-manager method, as I talk about here and elsewhere in my blog.]
I will state right now, before we get even further into this article, that, even as (somewhat of) a Stallmanist (Google that) and a Linux-religionist, I will be candid:  Ubuntu still has some bugs [the “in-betweener” releases in particular, that is; Ubuntu LTS versions (like 8.04 LTS, 10.04 LTS, 12.04 LTS) don’t seem to have many bugs in them], and some of the less well-known programs that people sometimes use in ms Office may still not have Linux equivalents, and not all of these Office programs can be successfully run in Linux with a compatibility-layer (such as the “Wine for Linux” program).  Ubuntu can be buggy—especially on a laptop, even to-day (2011/2012).  But Ubuntu is one of the easiest to learn to * use *.  Its documentation (online instructions) is among the most reliable and easy to understand.  So a lot of us start with Ubuntu.  I will add that, in all fairness, if you are able to learn to compile and install a custom kernel for your laptop (not as hard as it sounds), this is often a way of making most remaining issues with Linux disappear.  And I’ve just detailed how one can avoid touching the harddrive altogether.  There are other “tweaks”, too, which I shall come to in the progression of this document.  Read on.
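Since I keep mentioning it:  the broad strokes of compiling a custom kernel look something like the sketch below.  This is only an outline, assuming an Ubuntu-family system, a kernel source tarball already downloaded from kernel.org and unpacked under ~/src, and that you keep your current kernel around as a fallback in the GRUB menu:

sudo apt-get install build-essential libncurses5-dev    # compiler tool-chain and the menuconfig interface
cd ~/src/linux-<version>                                # <version> is whatever tarball you unpacked
cp /boot/config-$(uname -r) .config                     # start from the configuration of the kernel you're already running
make menuconfig                                         # change only what you need (say, one driver option); save and exit
make -j2                                                # build; -j2 means "use two cores"
sudo make modules_install                               # install the kernel modules
sudo make install                                       # install the kernel image; on Ubuntu this usually updates GRUB for you

Ubuntu folk also often prefer building a .deb package of the kernel (so the package manager knows about it), but the plain route above is the one you’ll see in most generic how-tos.  Either way, keep the old kernel entry in GRUB until the new one has proven itself.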
I feel compelled to add right here, however, that you should keep Googling (or “Yahoo-ing”, or AOL-ing, i.e. using other search-engines too)—if you find yourself disappointed with the documentation available from the official web-page of one of the many Ubuntu variants, such as Linux Mint.  You’ll find the answer on the web, and usually in no more time than it would take to satisfy comparable need for instruction in ms WINDOWS.  I recently went to Linux Mint’s official web-pages, with the intent of downloading their instructions for Linux Mint 13 Cinnamon Edition, which I had recently also downloaded—for the purpose of trying-out this build of Linux Mint.
To my chagrin, I discovered that the Mint pages did not offer a .pdf of official instructions for the Cinnamon Edition for download, like they do for Linux Mint 13 MATE Edition.  Ironic, perhaps, since Cinnamon is exactly the build you’d think people would go to first if dissatisfied with Ubuntu’s new Unity Desktop Environment:  Cinnamon and MATE are both based on the GNOME project, so you’d think users would try one of those before, say, the XFCE build.  Or at least that would be * my * logic.  Maybe it has been fixed by now (I made the attempt on 13 March 2013, and again 3 days later).  But regardless, just a little Googling beyond the official web-page would probably turn up plenty of adequate instructions.  And of course there are built-in instructions in Ubuntu and most variants, via the “Welcome Center”.  And/or clicking on “help” from the menu (or maybe clear everything off the desktop and punch F1, if you’re in Cinnamon).  It’s just that * I * find it more convenient to just download this as a .pdf, and have it available as a separate file.  But I guess more people nowadays will likely choose the “go > to > help” route.  Every user is different.
There are other alternatives to Ubuntu, too.  Like the other two of my “Big Three” (see entry below)—namely A) microKNOPPIX and B) Puppy Linux—especially Wary Puppy L.T.S.  And there are other “noob-friendlies” one can try:  Linux Mint, Linux Mint Fluxbox, perhaps CrunchBang, MEPIS, et al.
Once you get-it-down, Linux is a gift you will have given yourself for the rest of your life.  Once you have learned the basics, and can use a distro on a daily basis, you will almost certainly be capable of keeping-up with the relatively minor changes (progress), that will come over the rest of your adult lifetime.  Remember that even WINDOWS users are going to have to learn new things, just in order to keep-on using WINDOWS.  Microsoft has announced that it will end support for WINDOWS XP sometime in 2014.  And the user-interface in WINDOWS 8 is rumored to be way different, and a learning-curve to negotiate.
Still, I have to say that WINDOWS is probably less demanding—overall.  Unless you get a really bad trojan, or something else really nasty.  Then you will have to do like most of my neighbors seem to do:  if I can’t get their machine back up, they call a repair service.  If their geek can’t at least salvage the people’s personal files, they just develop a sad look for awhile, and a few days later, they bring home a new pc from the local Wal-Mart.  Then they ask me to install an anti-virus on it.  My neighbor Joyce almost had to kiss-off all her wedding pictures, which she had stored electronically on WINDOWS.  “Didn’t that upset you?”, I said in a concerned tone.  She did not answer for a long while.  Then finally, she said, “Well, I still have the ones printed on paper by the photographer.  And he still has the negatives at his business, in the town.”  The situation is about the same for Apple/MAC.  Yes, * any * computer system can be hacked:  there isn’t much of a way to stop a determined hacker—even in a UNIX/Linux-type environment.
But in a UNIX environment—especially Linux and BSD—we can make it * bloody difficult * for somebody to peep-in on us.  That’s why I prefer to re-master Ubuntu or Linux Mint, and then set up a live boot-arrangement:  even with a persistent-save partition configured (and this is optional), these “live Linux filesystems” are very difficult to compromise.  Especially if you re-boot once an hour, when working online.  An attacker’s malware would need some time to actually start to work, and a “live”-type boot of Linux flushes-out all changes as it shuts-down (except for a persistence folder—and you can filter what is allowed to go in there).  This type of boot-arrangement for Linux (“live booting off a vFAT partition”) is unlike a “traditional”-type install to a harddisk partition.
Even if you have a persistent-save partition set up, you are always given the option to boot the system as “live-only” at boot time.  So every time you boot/re-boot a live Linux, it’s just like you re-formatted it—even though you didn’t.  And with hourly re-boots, I’d think it’d be * hard * for an attacker’s malware to get through the live Linux desktop image, and find its way into the persistent-save partition.  The persistent-save partition can even be encrypted, too; and you can even just “re-create” it from time to time, if you’re super-paranoid.  [See Appendix A.]
Frankly, just using desktop Linux the “traditional” way—Ubuntu or perhaps one of its variants installed to a harddrive partition—the proverbial “dual-boot arrangement”, where you preserve your ms WINDOWS install, and can boot either that or Linux—it seems this is usually enough for most people.  This “traditional” way of using a Linux desktop seems to resist pretty much all malware and spyware (at least at the time of this writing).
And Linux may get easier still in the coming years.  (I refer of course to ordinary desktop, end-user-use.  The server market already uses Linux.  Probably just over half of all the server-computers in the world today are running Linux, instead of WINDOWS Server Edition.  But again, this is a good deal different from the way that an ordinary person wants to use a system.)
And nowadays the typical end-user increasingly wants to take his or her computer along, to use at a public library, or at the beach.  The Linux developer community is scrambling to keep up with this “laptop and mobile-use” trend.  Especially with regard to the newer (at the time of this writing) devices (“hardware platforms”)—netbooks, and more to the point, * tablet-computers *.  (The latter essentially means “i-Pad-like miniature computers”.)
Follow the numbered entries below, and use the Ctrl + F keyboard command to search these documents when you are looking for the answer to some question.  It is in this way, I am hopeful, that you will be able to have migrated to desktop Linux, without having done the “work” of migrating to desktop Linux.

Here is an excerpt from a post from Ubuntu Forums “Community Café”, which I have elected to include here.  Please note that this thread was begun several years ago, though I find that most of the sentiments expressed in it still apply.
[Author’s NOTE:  if you’re looking for a “turn-key” desktop Linux system, I frankly like Linux Mint 13 XFCE Edition, or maybe Linux Mint Cinnamon Edition 13.  ]

From:  http://ubuntuforums.org/showthread.php?t=63315&page=42
January 22nd, 2007       #415
whitefort
Ubuntu Extra Shot

Join Date: Jan 2007
Location: N Ireland

Re: Is Ubuntu for You?
This was a great article, if a little painful for me to read – I tend not to post much on Linux boards because I KNOW I’m probably the kind of WINDOWS migrant who makes accomplished Linux users want to bang their heads against the walls and strangle kittens.

I really, REALLY want to like Linux/Ubuntu. I love the whole philosophy behind Open Source. But would it be fair to say that the way Linux is sometimes promoted is itself partly a cause of frustration in those trying to switch from WINDOWS?

Just to take a personal example – I bought and read the ‘Official Ubuntu Book’ before installing Ubuntu.  IF YOU BELIEVE THE BOOK, the whole thing is pretty much plug ‘n’ play.  Ubuntu will easily install, effortlessly detect your network printer, you’ll click a few buttons and your network will be up and running, etc, etc,.

But really, for an awful lot of people, it’s not like that, is it?  It comes down to digging into the innards, changing conf files and so on… Scary stuff, for the average WINDOWS user.

Linux as an OS isn’t like buying a new car that you can just turn the key and drive away in – it’s more like (in some ways) getting a ‘fixer upper’, where you’re going to have to dig into the engine, do some rewiring, etc before you get a vehicle you can drive around in. OK, that’s not necessarily a bad thing, and the person who fixes it up will learn a LOT more about their vehicle – and maybe even have a lot of fun in the process.

But my point is, if he gets the ‘fixer upper’ thinking he’s getting a car he can just drive off in, there’s going to be some frustration, anger, bewilderment… When a Linux distro is promoted as a viable WINDOWS replacement, then the user finds they’re going to learn a heck of a lot more about their PC before they even get some basic things running… well, it’s a recipe for lots of hard feelings on both sides.

This is why I think Aysiu’s article [ http://ubuntuforums.org/showthread.php?t=63315 : remember that this thread was started in 2005; a lot has been improved since then] should be compulsory reading for every WINDOWS user who’s thinking of making the switch.

For myself, Linux/Ubuntu is something of a love/hate relationship at the moment. We’ve considered divorce, but I think we’re going to keep trying to make it work.  Maybe one day I’ll actually find a way to get the network and printer working properly.  At present, it means using WINDOWS for actually getting stuff done and Linux for… learning how to make Linux work.

Sorry – I didn’t mean to say so much – but it was a great article.

FROM THE SAME THREAD:
Curdsy
First Cup of Ubuntu
Join Date: Feb 2007
Location: The Hills, WA
Beans: 4
Re: Is Ubuntu for You?
Thanks for the rundown. I agree that Linux in general and Ubuntu in particular is worth taking the time to learn. I have run WINDOWS systems for 15 years and was blown away by how good Ubuntu 6.10 is.
AND ALSO:
January 18th, 2009       #576
airjaw
Just Give Me the Beans!

Join Date: Dec 2006
Beans: 74

Re: Is Ubuntu for You?
Ubuntu has overall been a positive experience.
I think I bought into some of the original hype and the reality of ubuntu’s shortcomings and struggles as a desktop have been sobering, but the truth for me has been better than the lie.

I’m enjoying 8.10 and besides the occasional bugs, I have had a positive experience with 8.10 and high hopes for future releases.

Personally, I had to detach myself from the distro and not turn my distro choice into a religious war. This was a personal decision and what worked best for me; for other people, making it their religion may work for them. But for me, now I am free to use Ubuntu for what it is good for, and not feel bad using WinXP for what XP does well. I’m a lot happier this way.

AUTHOR NOTE:  I’m gonna add right here that a real good thread to look at would be:

http://www.economist.com/blogs/babbage/2012/03/desktop-linux

THE BOTTOM LINE HERE:
The conclusion I have come to, sadly, is that Linux * desktops * will probly always be no more than a niche market.  I guess time will tell.  ANDROID—though a form of Linux—is another story:  it seems to have sort of created its own category.  And ANDROID is not fully open-source—though many users frankly don’t care about whether their software is open- or closed-source.  [Not everybody and his brother is a * Stallmanist * –Google that.]  Further, ‘DROID is not built as a * desktop * operating system, whereas Microsoft WINDOWS and Ubuntu, Fedora, Puppy, and their variants (as well as many other distros of Linux) * are *.  But as I said in the aforementioned text, people may figure out how to * use * ANDROID for desktop purposes.  They may have done this already, by the time I finally get this paper up-loaded.
Even if that happens, ‘DROID will never be fully open-source:  part of the file-system will always be closed to public scrutiny, and the owner/maker of ‘DROID (the Google corporation) is always going to data-track the user’s activities on ANDROID.  They say so up-front, in the ‘DROID EULA (End User License Agreement).  I have a great deal more trust in Linux Mint.  It is reported that even the mighty Ubuntu is going to begin this kind of data-tracking (in the 12.10 release).  Fine.  Maybe they need the money they think they’re going to make with that, in order to keep developing Ubuntu.  But I’ll bet you a new Ferrari against a donut that the Linux Mint people, and the makers of the rest of the variant-distros, will just bleach out this tracking, and it won’t have many real-world consequences.  The folks at Ubuntu will eventually remove it, and that will be that.
I have been accused of being rather a negative personality anyway.  So perhaps that should be taken into account, where my lack of optimism toward desktop Linux as more than a niche-market operating system is concerned.  I remain pessimistic, however, as to whether the Linux * desktop * is going to really make it onto a much greater number of our PCs than is currently the case.
Consumers (especially to-day) have little patience with usability-headaches.  And I find that Open Office Writer 3.2—which is what you use in Lucid (Ubuntu 10.04)—seems like the equivalent of something from 1998, though I hasten to add that this can actually be * better * in several ways than some of the later word-processors—these being somewhat overdone, from the standpoint of an ordinary home-user.  I really should try to install LibreOffice—I wonder if there’s a version of it that runs nice and stable on Lucid?  [UPDATE:  I am now using Linux Mint XFCE Ed. No. 13, which comes with LibreOffice 3 by default; LO 3 * is * every * bit * as * good * as * Microsoft OFFICE 2007, in my Windows install.  It may be as good as newer versions of ms OFFICE; I wouldn’t know, as OFFICE 2007 is the newest version I’ve installed, so far.]  Open Office 3.2 is nice.  But it has a harder time opening and closing big docs with a lot of formatting (.doc, I emphasize:  I’ve switched to .odt/.odf since I first wrote this, and the issue essentially went away in Linux—but now I have it in Windows 7—though I don’t use that much any more).  [UPDATE:  when I upgraded to Linux Mint 13 XFCE Edition, the LibreOffice 3 which it comes with * solved * this issue:  I can now open and use either .doc, .odt, or even .docx without issue.]  Features are a little more cumbersome to get used to in OO Writer 3.2, and some are lacking [where is “toggle case”?; and “continue numbering” (where you’ve left off previously) is more involved than in ms Office Word 2007].
It is also true that I had to fork-out about 130 USD to Microsoft for its Office suite, and it jammed-up one of my big, highly-formatted files in .docx, and I could only recover it by closing it with WINDOWS Task Manager, and then re-opening it (the next day) with ms Works.  Then I had to re-create it by successive copy and paste back into Word ‘07.  And even then I could not recover all the images, so those were just lost (note also that this was when I was still having to use MS’s Vista operating system, as I had not picked up my Win 7 machine from the store yet).  So perhaps it should be said that neither system is perfect.
In either case, had I but been using Dropbox at the time I was creating this file in the first place, I could’ve just gone to my cloud and retrieved a slightly older version, and saved most of the lost images.  I must say I find Dropbox works every bit as well in Ubuntu and Mint as it does in Windows 7.  And I will add that I am a user who has more than 800 documents of all sorts (my co-major was Divinity Studies)—several running into more than 150 pages, with images and screenshots.
I feel compelled to repeat here that the Karmic (Ubuntu 9.10) install to my Mom’s HP Pavilion Slimline tower went off without a hitch, and everything worked out-of-the-box (except the playlist feature in VLC—I’ve researched the issue—there’s apparently a software bug there:  but like I said earlier, the “in-betweener”, non-LTS versions of Ubuntu are more buggy—and 9.10 Karmic Koala is an “in-betweener”).  She uses it every day, though, even though she knows practically nothing about technology.  And she has not had to call me for support in months.  I need to say, though, that she only uses a computer for very limited purposes—surfing the web, and the few things she has learned how to do.
I suppose I should say as well that “consumer” is an operative word.  Americans (my fellow countrymen and countrywomen) arguably comprise the largest single cohort of world pc usage.  And we tend to think with a “consumerist” mentality.  Perhaps this should not surprise, in a land where corporations seem to dominate.  But it is because of this, I posit, that the very idea of a “Linux Community”—or pretty much * any * community at all (outside of churches & other houses of worship, at least)—the very * idea * of a * Community *—and therefore * community-based software *—is lost on the average American.  This sort of “not-for-profit” approach is just not our understood way.
There are a couple of pretty good symposia on YouTube, of about 45 minutes each.  These are devoted to * desktop * Linux, its problem-areas, and how it might be “made * completely * great *”.  The links are:


and

Most of the people I know probably just run XP or WINDOWS 7, this having come installed with the new (or pre-owned) machine, and they probly run it from the root (Administrator) and with no password, or with their dog’s name as the password.  They probably use Norton anti-virus, which again came installed, and which basically just lets them know when it needs some more of their money.  They feed Norton with their Visa card when it barks, and perhaps manually delete browser cookies once a day.  WINDOWS firewall runs in the background, and if you ask one of these people about their firewall, their reply is often something like, “What’s a firewall??”
Every spyware-robot program and black-hat hacker in the known universe has been in their system.  But the People either don’t know about this, or they are just “normal”, in that they do normal stuff with themselves, instead of obsessing over the arcana of operating-system tech-savviness.  And in this latter case, they probably cope by just considering the battle already lost, and choosing not to think about their lack of privacy and security.  And there are like 30 viruses and other malwares floating around in the file-system that Norton missed, but which WINDOWS has somehow “learned” to continue to operate with—rather like a man with advanced tuberculosis or syphilis who is too stubborn (or ignorant) to find a doctor who will see him.  Some will also argue that the preponderance of WINDOWS malware to-day is just spyware, adware, or bot-slaving:  and thus it is not designed to do obvious harm—i.e. to “crash” your system, or to delete files.  At least not such that it will become noticeable to you any time soon—all the while it is surveilling you, spying on your surfing habits, data-mining your drives, and perhaps bot-netting part of your harddrive, so that a black-hat hacker can use part of your pc’s computing power for nefarious purposes, and all you’re aware of is, “gee, this thing’s running slowly today”.
It is true at once that a WINDOWS install (with its legacy CP/M / DOS lineage) tends to * degrade * with ordinary use—on some systems even in spite of regularly-scheduled de-fragmentation.  This seems to apply even to the actual * system * files *, as well as stuff we add.  This seems to * still * be true, even in WINDOWS 7.  Contrast that against Linux, which is based on a much better, more “professional” underlying paradigm (UNIX); its file-systems do not require de-fragging, nor do files stored to the harddisk degrade all by themselves as time passes.
To be fair, nobody may have envisioned personal computers bein’ * networked * back in 1975, when CP/M was created.  Nor did people know that CP/M would become the model for personal-computing file systems.  Gary Kildall began it as a personal project, for his own use.  From there, and by a somewhat circuitous route, a work-alike of Gary’s home-made creation (QDOS, later 86-DOS) found its way to a small start-up company that wrote interpreted programming languages, and this company was by that time known as the Microsoft Corporation, the founders having dropped the hyphen between * Micro * and * soft *.  There’s arguably still a whiff of that CP/M / DOS heritage under the hood, even in WINDOWS 7.
And again, to try to be fair, Microsoft has spent Billions—* Billions *—to try to “close the loopholes”, and remove the kinks.  [And, to update this, Windows 7 now seems to have been rendered remarkably secure—as of the time of this update, or somewhat prior—December 2012:  Microsoft, ** finally **, seems to have successfully closed * most * security-holes in Windows (by which I mean Windows 7—with the updates/patches added).  But really, I have to say that they could’ve probly accomplished this ** years ** ago, if they had truly willed it:  so much of my research indicates it.]  Anyway, you CAN’T make a silk purse out of a sow’s ear.  This is a major reason for the deprecation and abandonment of XP.  And this same principle might have to do with why Apple started over with the acquisition of the NeXT company c. 1997 (and the re-hiring of its creator—Apple’s own founder, Steve Jobs), making NeXT’s operating system, NeXTSTEP, the ancestor of modern Mac OS X.  And NeXTSTEP was based on a form of BSD, which is a *nix-type system, and a first cousin to Linux.  In fact, the relation is so close that many bash commands that Linux can use will work in OS X, just as they will in most “*nix-type” operating systems.
Neither was an operating system Microsoft’s first choice of product.  They weren’t even in the operating-system business at the time.  They were writing (and re-writing) computing * languages *, which operating systems * use *.  MS got pushed into it by the deal IBM virtually shoved down their throat, when International Business Machines decided to get into the micro-computing market, around 1980.  So MS didn’t have time to pick and choose, and UNIX was not yet free of its foster parent (AT&T), which was just in the process of being broken up.  No, I don’t wish to make excuses for some corporations’ bullying business tactics.  But the fact of the matter is that computing power doubles about every * eighteen months *, and this makes the software industry a cut-throat business, whether some of us like this or not.
Microsoft, in many ways, is just the most cunning cannibal in the jungle.  (Or on the tundra, if you like).  And this is just a fact of the world, at least for the present-time—and one which we all ought daily to try and dredge-up the maturity to acknowledge, sad though it is.  A real writer—far above my poor accomplishments—and not content, perhaps, with the status-quo, might add some better words, were he here to observe the situation:  “…or, to take-up arms against a sea of troubles; and, by opposing, end them.”  Yes, I have rendered it out-of-context.  But as I age, I often wonder whether an alternate meaning might not have been implied.
What really “protects” Linux can be put down to several factors.  At the time of this writing, I find these to be somewhat different, perhaps, than what many have had to say on message-boards, and the like.  Many posters tout Linux’s file-system and from-the-ground-up multi-user layout/permissions setup—Linux being essentially a re-implementation of UNIX, which was/is a “true” multi-user system invented for shared minicomputers and mainframes.  Well, a lot of this functionality is duplicated mighty well in Windows 7—if UAC is properly employed—and that is not hard to do, even for a fairly pedestrian user.  Yes, there are ways for black-hats to get around User Account Control.  But I tend to think that the biggest security problem with UAC is that a whole lot of people just turn it off.
You could, for that matter, just run Ubuntu from the Root (Admin) if you wanted; there is info on the web as to how to do this, ** if ** you ** already ** have ** the ** sudo-enabled (primary-user) ** password **.  But why would you want to??  In desktop Linux, everything that is gonna work at all—is gonna work just as well from an ordinary user account, as it would from the Root / Administrator.
The only exceptions would probably be some operations that an ordinary desktop user wouldn’t encounter.  Like certain processes to rescue a web-server, or maybe work on a damaged harddrive.  And of course—as I’ve probably already said—Linux and non-microsoft systems, generally, do not support * a * lot * of the newer and more popular games.  This is even true of Apple/MAC, at the time I write this.  There * are * some nice games in Linux (Battle for Wesnoth comes to mind); but really, * gaming *, dude, is a major reason why I don’t advise people to over-write their ms WINDOWS install:  it’s better to dual-boot.  And there are other reasons, too, as I’ve said.
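For the curious, this is roughly how the arrangement looks in practice on the Ubuntu family (a minimal sketch, nothing more):  you stay in your ordinary account, and prefix the occasional administrative command with sudo, which asks for * your * password rather than a separate Root one.

sudo fdisk -l     # a one-off administrative task (listing the disks); sudo prompts for your own password
sudo -i           # only if you genuinely need a persistent root shell for a while; type "exit" to drop back out

Day to day, the second command is almost never needed; the first pattern covers practically everything an ordinary desktop user will ever do.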
So I will reiterate:  neither system is perfect.  There * is * no way to stop a determined hacker.
Having said this, to my mind, Linux security (I speak primarily of * desktops *, and of to-day, anyhow) has more to do with things like 1) user-knowledgeability; i.e. desktop Linux comes already installed on comparatively few laptops.  And desktop Linux can be very fiddly to install on a laptop—or even a tower.  SO PEOPLE WHO ARE USING DESKTOP LINUX ARE MORE LIKELY TO BE PEOPLE WHO * CHOSE * LINUX, and who therefore are more knowledgeable—generally—as to personal computing.  They know their way around the file-system a little better (at least) than the average Windows user, or they soon learn to know it.  When (if) an unexpected file appears on the Linux desktop, a user of OpenSuSE, Debian, or Ubuntu is perhaps more likely to be experienced enough to check-it-out thoroughly first, and then probably still not open it (if he or she can’t figure-out how it got there).
2)   Probly just over half of all the world’s servers are Linux boxes.  Yes, this is owed in part to the fact that Linux is free (including several distros aimed squarely at server use).  But it is also because Linux server distros have * merit *, and perform well.  An ancillary effect of this, however, is that there is a community of Linux server administrators, and it is tight-knit.  If somebody’s Linux server gets exploited, the distributed online community is much more likely to broadcast it to everybody (and how to “patch the vulnerability hole”), via the web.  The server community represents a huge user-base, and its administrators have a vested commercial interest in not getting hacked/malwared/exploited.  And so the benefits of this “collective security” spill over into the desktop arena.  Patches done by administrators of servers can usually be copied freely by the makers of Linux desktops, because Linux (unlike Windows) is FOSS/open-source.  And because the underlying system (for Linux server and for desktop) is much the same anyway, and so much of the underlying code for each is openly available on the internet (or on Usenet), the makers of a desktop distro can apply a patch that was perhaps originally created for servers—perhaps even within the space of an afternoon.  And it can then be sent out via Update Manager.
So you can get the update in perhaps a matter of hours, from the time the malware was first reported.  Even if a large chunk of desktop Linux users don’t apply updates, there will likely be enough who inoculate themselves to keep a virus from spreading to enough machines to be useful to the virus-writer.  Yes, this is a market-share argument (of a kind).
3) Then there are the internal twists-and-turns of the Linux/UNIX file-system itself.  Yes, there are ways to malware a computer without knowing much about the internal file-system (relatively speaking).  But this is still another impediment.  Of course, it’s also (arguably) one of the things that makes desktop Linux harder to learn, because there are very few of those “installer wizards” like the ones seen in the Windows universe (though there * is * a good deal of GUI software-install support in Ubuntu itself:  programs like Archive Manager, Software Center, and GDebi Installer—so I guess that amounts to the same user-experience, basically—without the added insecurity of using some “installer wizard” from a webpage that might contain DOS-type malware; Software Center and GDebi Installer are part of Ubuntu itself—not a “guest program”, like the “installer wizards” in WINDOWS).
Groups of Linux distros use different package managers, and there is more than one means of installing a software to desktop Linux [unless, like probly most of us, you’re just going to stick to your default repos—but these are proctored by the distro’s makers, and surveilled by the Linux community—and many more of these persons (arguably) know how to program, and are therefore capable of scrutinizing open-source code, than is the case in the Windows universe (which is mostly closed-source anyhow, and therefore much less open to scrutiny)].  This makes Linux a harder environment to learn; but at once a harder environment for the virus-writers.  As I may have said elsewhere in this document, one install of ms Windows is a lot more like every other, than is the case with desktop Linux.  So a Windows malware is (arguably) easier to pass from one host to another.
4) From-the-ground-up multi-user system architecture.  Yes, the UNIX permissions-system Linux uses can be accused of being antiquated.  But it is probly harder to defeat by default than the similar protection in ms Windows (UAC), which, by the way, a lot of people just turn off.  (And UAC is not something Windows possessed historically:  it was added by degrees, its predecessors starting somewhere, I guess, around the Windows 98SE era.  So one could argue that Linux/UNIX has a lot more experience/development knowledge-base in this area.  Windows, aboriginally, was built to run only from Root, as a single-user-only system.)  Further, I find a great many Windows users are still running normal, everyday use sessions as Root/Administrator, even though user-account functionality seems largely perfected in Windows 7.  This often impresses me as a legacy habit acquired in earlier versions of ms Windows (2000, earlier builds of XP, and of course the notorious Vista).  They don’t still have to be doin’ it that way, but many Windows users persist.  In Linux, by contrast, many of the modern desktop distros hide the Root/Admin account by default, and employ Linux utilities like sudo or su to ask for your password to do something that normally requires administrator-level permission.  You * can * circumvent this if you want—even in Ubuntu and its variants.  (There’s plenty of information about this on the web.)  But this is not my point.  My point here is:  why would an ordinary user * want * to?  Everything in desktop Linux that’s gonna work, is gonna work just as well from user as from Root (unless maybe we’re talking about some rescueware—like maybe System Rescue CD—but then that’s not ** desktop ** Linux).  So most ordinary Linux desktop users don’t circumvent the hiding of the Root account.  Many users of Win 7, by contrast, are still running the system as Administrator, and this can have consequences.
5) The major distros do not all use the same permutation of the Linux kernel.  Whereas it would be much harder to use a significantly different permutation of the NT kernel on your Win 7 computer, because it’s hard to modify (being closed-source); and ready-made re-mixes of the ms Windows NT kernel that might be available for download on various file-sharing websites are in fact illegal, because Microsoft Windows is proprietary software.  In the age of magnet-files and cheap web-access in countries whose governments do not support various treaties and conventions to do with IP & the like, maybe this is a moot point, de facto.  The RIAA’s war on p2p seems (to this observer) about as ineffectual as the American “war on drugs” waged in the 1980s.  But anyway, Linux uses the GPL licensing scheme (and much of the surrounding software uses related free licenses, like the BSD license), which makes the Linux kernel freely modifiable, and the makers of the major desktop distros use this to their advantage.
6)     Windows has been around for such a long time, and has been so ubiquitous, that it has just been too-juicy-of-a-target.  Yes, Linux has been around a long time, too.  Longer, in fact, if you count its progenitor, UNIX.  But historically, there just hasn’t been as much point in spywaring Linux.  If you Google “Linux Malware”, or “tools to malware a Linux distribution”, I think you’ll find that almost all of what can be downloaded to attack (* desktop *) Linux is either a) just some proof-of-concept created in a CS department, and which probly doesn’t work effectively OOTB; or b) it has been effectively (and quickly) patched, and is therefore no longer a real threat.  If you read the information closely, I’m saying.  [Note again that I speak here of * desktop * Linux—* server-Linux * is another kettle-of-fish, and beyond the scope of these documents.]
Contrast that against poor, long-suffering ms Windows.  What you’ll find is that there is no shortage of freely downloadable “hack-tools” and other nasties that can be used against even a reasonably well-configured Windows 7 desktop.  And with good instructions to boot!  Windows has been so put-upon, and has been such a big, juicy target, that a grade-schooler might be able to learn to successfully attack it—even if he (or she) is not a child prodigy.  There’s just a huge knowledge-base online, as to how to exploit Windows.  Linux, on the other hand, can be a juicy target too.  People would love to hack webservers and get hold of your passwords.  And many bank mainframes use Linux—just because they feel it can be made more secure.  But try searching for suites of free tools to do the job.  I posit that you won’t be able to uncover nearly as much for Linux as for Windows.
7) Further, where it comes to * desktop * Linux, at least, the file-system can be run “live”, as a compressed and read-only software-stack.  (You can do this with Windows, too; but it’s a * lot * harder.  [It may have gotten easier, by the time I’ve released this writing on the web, for public consumption; but I’d still bet it’s a good deal * harder * to do than just learning about desktop Linux.]  Trust me—learning to run Windows as a live, read-only/compressed file-system, and still have a fully functional system, would be a ** lot ** harder to accomplish.)  In desktop Linux, there are ready-made ways to run the distro as a P.M.I. or “frugal” install, from either a harddisk or thumb-drive, or one can just run many of the desktop distros from a cd/DVD, which * is * compressed.
Running as compressed (“live-image”), the root file-system cannot be written to.  Or it would be very difficult.  Malware would have to install itself to other locations, such as (perhaps) the ramdisk, or possibly (according to some people) to the computer’s BIOS firmware.  Well, changes made in RAM are temporary:  they disappear with a re-boot—or else can only be saved if a persistence folder has already been configured and the operator allows the changes to be written there.  The best way to do this arrangement, IMHO, is to keep trying live-cds/live USB-keys—which you can easily make with various (free GUI) programs now available on the web—until you find the best distro for you.  Then install it to your harddisk, and use it from there for a while, adding the programs and settings you like, and getting to know it even better.  When you reach a point where you’re satisfied, learn to install and use Remastersys (or the remastering-app appropriate for your distro—this is not hard, and it’s all GUI).
Then run the remaster-er, and make yourself your own .iso file of your customized desktop Linux. Use a folder you’ve created ahead of time in a USB thumb-key as the storage destination for the .iso.  Then, install the remaster as a read-only “frugal install”, to a FAT-32 partition on your harddrive.  Or use it from a USB thumb-key, if you prefer.  See Appendix A of this document.
8) The feeding of patches to the update managers in the major Linux distros is a * lot * more prompt than its Windows counterpart—the feeding of Windows Update.  For whatever reason, by the time good ol’ Microsoft gets around to sending you a patch to fix a vulnerability, you might have already gotten the malware.  The Linux community, on the other hand, has a track-record of much greater vigilance.  Not ** perfection **.  But *** much *** greater vigilance.  (A minimal command sketch of what “staying patched” amounts to appears just after item 9, below.)
9) Finally, Linux (especially the desktops) just seems to be “left alone”.  For whatever reasons.  And one can say about the same for Apple/MAC.  As an aside, I’ll add (as I’ve stated elsewhere), that Apple/MAC (OS X) is really a UNIX/POSIX-type system too, just as Linux is—whatever bearing you care to ascribe (or not to ascribe) to this fact, it is a fact.
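Here is the promised sketch.  On Ubuntu and its variants (Mint included), staying patched and installing from the official repositories both come down to a couple of Terminal commands (the graphical Update Manager and Software Center do exactly the same things behind the scenes); the package name vlc below is just an example of something that lives in the standard repos:

sudo apt-get update          # re-read the repositories' package lists, so you see the newest patches
sudo apt-get upgrade         # pull in whatever patched packages are waiting for you
sudo apt-get install vlc     # installing a program from the official repos, rather than from some random webpage

That last habit (sticking to the repos) is Entry 52’s whole point, and it is a large part of why the “installer wizard from a webpage” style of infection doesn’t travel well on desktop Linux.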
I’m gonna add that the most vulnerable piece of * any * os running to-day—at the time of this writing—has gotta be the * web browser *.  Number two would be the ** user **.  Or perhaps it’s in the inverse order.

Eventually, the malware collected by WINDOWS—just from being connected to the internet—can no longer be held in check, and the guts rot out from the inside.  Or a particularly nasty virus just cripples it.  (I refer here mostly to XP—WINDOWS 7 seems to have been fixed-up a lot, at least at the time I add this particular sentence—January 2013.)
When this happens a geek is sometimes sent-for, and she/he may be able to give XP another lease on life.  But just as often, it is  assumed by the machine’s owner(s) that it is “done”, and, when some of the household’s monthly bills are able to be put-off for a few weeks, the same Visa card takes a day-trip to the local Wal-Mart, and the card-holder returns with a new (cheapie) pc.
And the cycle starts again.
It is further the case that a substantial number of SOHO-users (Small Office/Home Office) have a considerable investment in their various WINDOWS softwares, and  will probably continue the pattern of “prudently upgrading” to successive versions of WINDOWS in order to “preserve their investment”.  It’s rather a vicious cycle.
A lot of Linux users on forums, I’ve noticed, seem to be people who made the switch because their WINDOWS ME or VISTA blew itself to pieces right in front of them.  Or they were one of those lucky people who got some particularly nasty virus in XP.
And I’d have to wonder what it’d be like to use desktop Linux for running a mid-sized business. There are a lot of people who do this, actually.  But you can bet your sweet bippy that 1) they have their install nailed-down, such that any relevant bugs associated with the compatibility of their hardware were long-ago solved—by themselves, by “luck”, or by a geek who knew; and 2) they are probably just usin’ it to operate the business, and only for that limited function-set.  Nobody is tryin’ to build stuff on Second Life with it, while they’re supposed to be workin’.
Desktop Linux continues to develop, however, and so may be better 6 months from now than it is currently.  That is one of the good things about technology generally.  But it seems to apply to Linux particularly—a few areas excepted.  So there is Hope.
If you want a further break-down of desktop Linux’s remaining difficulties (and some possible solutions), you might wanna check out Entry 81 (“Miscellaneous Issues with Desktop Linux”), near the end of this doc.  Note that it is not a complete list.  And this entry is aimed at my fellow * desktop * users, and not at developers/programmers, who can draw up (and have drawn up) lists of their own as to Linux strengths and weaknesses in the realm of developing and coding.
Having said all this, I will add that I am stickin’ with Linux, at least for my own systems.  I might even try to do some guerrilla proselytizing—like burning some live cds and leaving them lying around in strategic places, like the public library.  I could draw a smiley on each one with a magic marker, and write something compelling, like maybe “Are you a geek?  Take this free Linux cd, and find out.”  Perhaps with the sub-heading:  “Take me home, I’m free”.
I’d  better not get caught, though—they’d probably pull my library card if somebody successfully booted Linux on one of the lib. computers because of me.

This is a list of the very basic steps.  If you need more in-depth coverage, see other files in this data-base.
Others will post here, so be sure and read the comments-threads below.
CONDENSATION and TABLE OF CONTENTS:
Here is an enumeration of the steps described below, by entry number:
1. Entry 1:  PATIENCE AND A LITTLE OF A “PIONEER-SPIRIT” IS STILL NECESSARY
2. Entry 2:  A BIGGER PIPE CARRIES MORE WATER, AND THERE’S NOT MUCH WE MERE MORTALS CAN DO TO CHANGE THIS
3. Entry 3:  DON’T GET MAD AT ME, I’M JUST TRYIN’ TO HELP
4. Entry 4:  I DON’T EAT MUCH FRUIT
5. Entry 5:  GET A CURRENT LINUX FROM THE WEB—JUST NOT THE VERY LATEST ONES
6. Entry 6:  What is 32-bit versus 64-bit operating system?
7. Entry 7:  IF YOU’RE A GAMER, BEST TO INVEST IN AN X-BOX.  Or just don’t over-write your WINDOWS install.  There are other reasons to preserve our WINDOWS install.  I have described methods (below) to install Linux desktop, without harming WINDOWS.
8. Entry 8:  TouchPAD Functionality in Linux
9. Entry 9:  TOUCH SCREEN Functionality in Linux
10. Entry 10:  LINUX, HISTORICALLY, HAS NOT BEEN AS BIG ON “EYE-CANDY” and visual-effects, BUT NOW IT HAS A LOT OF THIS.  (IF YOU WANT IT.)
11. Entry 11:  THE THREE STEPS TO USING LINUX
12. Entry 12:  CONSIDER THAT LINUX HAS CONSIDERABLE DIFFICULTY WITH SOME PERIPHERAL DEVICES……  Entry 12a:  SOUND CARD SUPPORT IS STILL NOT WHAT IT SHOULD BE IN DESKTOP LINUX
13. Entry 13:  BACK-UP YOUR DATA:  Make backup copies of any files you don’t wanna lose, before you do anything else.  This is proper procedure anyway.
14. Entry 14:  CONSIDER THAT LINUX HAS CONSIDERABLE DIFFICULTY WITH SOME PRINTERS:
15. Entry 15:  Know right now, that due to the way modern microcomputers are manufactured, there are a huge number of motherboard configurations and combinations possible….
16. Entry 16:   “RIVER” CAN BE BETTER THAN “POD”:
17. Entry 17:  START USING THE MAJOR LINUX APPS WHILE YOU’RE STILL ON WINDOWS:
18. Entry 18:  TAKE COMFORT IN THE SIMILARITIES:
19. Entry 19:  BROWSING FROM LINUX:
20. Entry 20:  PLAYING A DVD
21. Entry 21:  DON’T KILL YOUR WINDOWS!
22. Entry 22:  CHOOSE FILES-FORMATS THAT WILL WORK ON BOTH PLATFORMS…well, really, this doesn’t seem to apply anymore:  even .docx is now supported in desktop Linux.  So this entry is probably archaic—but it might still be of use to people who need to run some older release of desktop Linux.
23. Entry 23:  LET YOUR HEAD—OR AT LEAST PART OF IT—BE IN THE CLOUDS
24. Entry 24:  Google EARTH
25. Entry 25:  ALPHA AND BETA:
26. Entry 26:  A WORD ABOUT DESKTOPS AND CHOICES:
27. Entry 27:  OTHER “CONFUSABLES”:
28. Entry 28:  MY “BIG THREE”:
29. Entry 29:  HOW GOOD IS YOUR MEMORY?
30. Entry 30:  NVIDIA CARDS:
31. Entry 31:  HOLY KALEIDOSCOPE BATMAN!  I JUST BOOTED MY LINUX AFTER INSTALLING IT TO THE INTERNAL HARDDRIVE, AND NOW THE SCREEN IS UNREADABLE!!
32. Entry 32:  Most Linux distros have a Wikipedia article about them, which is worth at least a perusal.
33. Entry 33:  At least have read-through the release-notes of the particular distro—especially if you decide to use it on a permanent basis.
34. Entry 34:  LINUX STELLAR PRIVACY AND SECURITY:
35. Entry 35:  OTHER CONTENDERS (for noob-friendliness)
36. Entry 36:  TRY A FEW DISTROS AS LIVE-CDS, TO FIND THE ONE THAT’S RIGHT FOR YOU AND YOUR MACHINE:
37. Entry 37:  USE “LOCAL MIRRORS”:
38. Entry 38:  USE THE CHECKSUM:
39. Entry 39:  BURN ON SLOWEST SPEED:
40. Entry 40:  TRY THIS ONE FIRST, IF ONLY TO GET YOUR FEET WET:
41. Entry 41:  DON’T GIVE-UP:
42. Entry 42:  IF THE BROWSER WORKS…
43. Entry 43:  USE BACKUP MEDIA:
44. Entry 44:  DON’T READILY USE “COMPUTER JANITOR”, OR EQUIVALENT PROGRAMS.
45. Entry 45:  LAPTOPS AND MOBILE DEVICES ARE A BIT OF A DIFFERENT CONCERN, WHERE IT COMES TO RUNNING LINUX ON THEM:
46. Entry 46:  UPDATE YOUR SYSTEM BEFORE YOU INSTALL A SOFTWARE:
47. Entry 47:  FLASH VIDEO AND MULTIMEDIA:
48. Entry 48:  I’M NOT SURE WHAT AN “OGG” IS, BUT I DON’T LIKE THE SOUND OF IT.
49. Entry 49:  “RIVER” CAN BE BETTER THAN “POD”:
50. Entry 50:  USE A POST-INSTALL CHECKLIST:
51. Entry 51:  WHAT THE HECK IS THIS “ Command-Line/Console/TERMINAL” thing, AND HOW DO I GET TO IT IF I NEED TO?
52. Entry 52:  STICK TO YOUR DISTRO’S REPOSITORIES, IF YOU WANT TO ADD A SOFTWARE:
53. Entry 53:  KNOW HOW TO SHUTDOWN LINUX:
54. Entry 54:  Most of your major keyboard commands that you are familiar with from WINDOWS work just the same way in most Linux
55. Entry 55:  SUPPORT:  THERE’S LIVE HELP:
56. Entry 56:  SUPPORT:  You can obtain help for almost any particular program or distro in the IRC server irc.freenode.net
57. Entry 57:  SUPPORT:  EVERY MAJOR LINUX DISTRO HAS A USER-MANUAL. These can be found online.
58. Entry 58:  SUPPORT:  LINUX MAN-PAGES:
59. Entry 59:  SUPPORT:  FORUMS ARE VERY USEFUL:
60. Entry 60:  SUPPORT:  UBUNTU SUPPORT RESOURCES
61. Entry 61:  SUPPORT:  Canonical, Ltd., the company that develops Ubuntu, has pay-for support plans available
62. Entry 62:  COPYING AND PASTING INTO AND OUT-OF A TERMINAL WINDOW:
63. Entry 63:  NO DEFRAGGING NEEDED.
64. Entry 64:  ABIWORD IS A FINE PROGRAM, BUT…
65. Entry 65:  A CLIPBOARD WORK-AROUND:
66. Entry 66:  FILENAMES:
67. Entry 67:  SAFE MODE AND SYSTEM-RESTORE ARE NOT NEEDED IN LINUX
68. Entry 68:  LINUX COUNTS FROM ZERO, NOT FROM ONE:
69. Entry 69:  LINUX IS CASE-SENSITIVE.
70. Entry 70:  INSTALLING LINUX:
71. Entry 71:  IT KEEPS CRASHING?
72. Entry 72:  WHAT IS “BORK MY INSTALL”?
73. Entry 73:  LINUX HAS COME WITH NTFS SUPPORT FOR SOME TIME NOW:
74. Entry 74:  WHAT THE HECK IS “HOME DIRECTORY”?
75. Entry 75:  OKAY, BUDDY, SO SUPPOSE I WANT TO HAVE GOOD LINUX SYSTEM SECURITY, BUT I DON’T WANT ME OR MY GIRLFRIEND TO HAVE TO TYPE-IN THAT TEDIOUS, HARD PASSWORD, JUST TO LOG-ON TO PLAY A GAME, AND LIVE-CHAT WITH AUNT LINDA??
76. Entry 76:  BACKUP ON LINUX:
77. Entry 77:  HUNG WINDOW?  FROZEN APP?
78. Entry 78:  In Linux, there are often * WAYS * to do things, rather than * the * way to do something
79. Entry 79:  FILES-SYSTEM AND FILES-FORMATTING:
80. Entry 80:  I HAVE SOME FILES STORED ON A USB THUMBSTICK. HOW DO I OPEN (MOUNT) THIS THUMB-KEY, AND HOW DO I SAFELY REMOVE IT?
81. Entry 81:  MISCELLANEOUS ISSUES WITH DESKTOP LINUX:
APPENDIX A

——————————————————————————
1. Entry 1:  PATIENCE AND A LITTLE OF A “PIONEER-SPIRIT” IS STILL NECESSARY:    Be willing to have patience and invest some time.  Once you have learned the basics, and can use a distro on a daily basis, you will almost certainly be capable of keeping-up with the relatively minor changes (progress), that will come over the rest of your adult lifetime.  It is good to know your way around both platforms, if for no other reason than a lot of your WINDOWS rescue-discs nowadays are Linux-based.  Remember that even WINDOWS users are going to have to learn new things, just in order to keep-on using WINDOWS.  Microsoft has announced that it will end support for WINDOWS XP sometime in 2014.  And WINDOWS will probably NEVER be made as secure as Linux can be.  (As secure as a Linux install *can * be * made * to * be*, I am saying.)  Be willing to Google for some info.  Linux is more work, but it is far more secure (and can be made even more-so), and you get it for FREE.
2. Entry 2:  A BIGGER PIPE CARRIES MORE WATER, AND THERE’S NOT MUCH WE MERE MORTALS CAN DO TO CHANGE THIS:    If you don’t have broadband (DSL) or its equivalent, this is gonna be a lot harder in the long-run (and probably in the short-run, too), and your practical options will be limited.
3. Entry 3:  DON’T GET MAD AT ME, I’M JUST TRYIN’ TO HELP:    I have tried to make this blog a series of “shortcuts to Linux”, so you don’t have to repeat my hardships or the errors I made from ignorance, or duplicate all of the research I had to do, to migrate to Linux myself.  As with * any * Linux advice you receive from the web or any other source, you should not trust just one source.  Google a little, and confirm any action you are about to undertake with at least two different sources.  This is also good practice, by the way, in Microsoft WINDOWS.  Linux is * free * software, which you choose to install or not to install:  it offers no warranties, express or implied.  But a reasonably responsible person who is capable of reading a manual and abiding by its instructions (though I’d still confirm with another source), and who backs-up his or her data properly before executing any serious operation, should have little chance of catastrophe, I’d think.
4. Entry 4:  I DON’T EAT MUCH FRUIT:  I undertake to write this with the assumption, as I indicate variously in the text, that you are coming to Linux from a WINDOWS environment.  If you are coming from a MAC environment, what I am able to offer here may not be of nearly as much value, as I don’t know much about the MAC platform.  Sorry.
5. Entry 5:  GET A CURRENT LINUX FROM THE WEB—JUST NOT THE VERY LATEST ONES: Current, contemporary versions of desktop Linux are even more graphical (point-and-click), user-friendly, and newbie-friendly than older versions, some of which are still available for download.  Just don’t get the very newest, “bleeding-edge” Linux, like the latest release of Fedora, or some Ubuntu that’s still “in beta” (or in alpha).  The Ubuntu people release their long term support (LTS) version every two years, and it is then updates-supported for the next three years (Ubuntu 12.04 for five), overlapping the next “LTS” release.  UPDATE:  LTS-versions are now supported for * FIVE * years, as of the release of v. 12.04, in spring of 2012.  And I feel the current one (whatever that is by the time you’re reading this) is a good place to start.  The Puppy Linux 5x live-cd series is also current, at the time I’m writing this, and is good for absolute beginners.  Be sure and read the comments thread below this article, as others will post with suggestions.
Don’t think with the logic “an older one will be simpler, because my aunt Janet said that she had a lot of trouble learning to use WINDOWS 7, after having used XP for all those years”.  Newer Linux releases are usually easier to use.  It’s more a matter of finding a distro (desktop Linux distribution) that doesn’t require much fiddling (or any) to recognize your hardware stuff (sound card, networking, mouse, &tc.).  And then it has to be a distribution that is easy enough for you to run.
6. Entry 6:  What is 32-bit versus 64-bit operating system?
Be aware that there are now 64-bit CPUs out there.  This has been so for some time now.  Your computer may have one.  Frankly, I just use the 32-bit versions of Linux, even on my good lappy here, which has a 64-bit CPU.  The math is simple:  you can run 32-bit Linux and its associated packages just fine on 64-bit hardware; but you cannot run a 64-bit version on hardware whose CPU only supports 32-bit instructions (the 64-bit address bus just isn’t there).  And as for us as noobs, why in the world would we want to try, even if there were some trick for it?  I’m lucky I barely know enough to turn this sucker on in the morning, and use it for what I need.  I’m not lookin’ to try anything as weird as running a 64-bit Linux on a 32-bit computer.
Further, I am told there is a wee-bit more fiddling, to get some apps working on a 64-bit install.
So my conclusion is that you start-out with just the 32-bit version of a distro.  Some distros still don’t offer a 64-bit build anyway, even at the time of this writing (October 2011), though many others now do.
Try live cds of 32-bit builds, and if none of my “Big Three” (see entry 28) seem to work for you, start downloading and burning 64-bit ones.
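If you’re curious whether the CPU in your machine is actually 64-bit capable, you don’t have to take anybody’s word for it:  you can check from the terminal of almost any Linux live-cd.  A minimal sketch (the exact wording of the output varies a little from distro to distro):

    # The "lm" (long mode) flag in /proc/cpuinfo means the CPU can execute 64-bit code:
    grep -ow lm /proc/cpuinfo | uniq
    # Prints "lm" on a 64-bit-capable CPU; prints nothing on a 32-bit-only one.

    # Most distros also ship the friendlier lscpu utility:
    lscpu | grep -i "op-mode"
    # Typical output:   CPU op-mode(s):  32-bit, 64-bit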
NOTE that any live Linux disc (cd or DVD) is gonna be  S – L – O – W – E – R  than when/and/if Linux is installed to and running from some other media, like an internal harddrive, or an external/add-on harddrive, or even a USB thumb-stick.  Slower to boot-up, and slower in running.  Not having enough hardware resources/“computing horsepower” can also cause Linux that is being tried from a live cd to be slow, or to “hang”.  This is just something you have to suss-out.

Puppy Linux, however, is rather an exception to this, because Puppy is one of the comparatively few distros that is small enough to have ALL of the File System loaded into the computer’s RAM as the boot process finishes—and Puppy is pre-configured to do so.  It is because of this feature that Puppy Linux will run just as fast from a live-cd as from any other boot-media.  There are other distros that are like Puppy in this way, but they are comparatively few.  KNOPPIX also—though it does not load all of its FS into the RAM—is somewhat exceptional in this regard, as it normally runs as fast from a live-media, and the creators discourage harddrive installation anyway, because KNOPPIX is made to run from a disc or a thumb.

A VERY IMPORTANT NOTE/UPDATE TO TACK-ON HERE, IS THAT MOST USERS (by the time I’ve had a chance to insert this) ARE NOW PROBABLY SKIPPING THE * BURN * THE * CD * PART, BECAUSE WINDOWS-BASED PROGRAMS LIKE UNIVERSAL USB INSTALLER from pendrivelinux.com, et al. (free!), ARE INCREASINGLY PERFECTED, AND SO YOU CAN TRY-OUT DESKTOP LINUX FROM A BOOTABLE USB KEY YOU MAKE IN ABOUT 15 MINUTES, AND IT WILL RUN AT VERY NEAR FULL-SPEED!  This is now a * better * way to test-run desktop Linux, and it will usually install just fine from the USB-drive, if you decide to do so.  Further, it usually won’t hurt the USB thumb-drive (“pen-drive/jump-drive”), and you could re-use it later, for other stuff, just by re-formatting it back to fat32.  One caution about this method, though, is that you do run (some) risk of writing the Linux distro to the wrong drive, if you’re   a) really inexperienced, or   b) really tired.  So be careful!  Or just stick to trying desktop Linux from a cd/DVD.

CDs and DVDs are still good, though, to make backup-copies of all your text, documents, batch, audio and video files (and your operating system(s) too), once you’ve installed the systems and programs you want to your disk.  Always make backup-copies of your Windows operating system on cds/DVDs as soon as you get a new (or used) computer, and then once again before you mess with Linux, BSD, or any other system.
7. Entry 7:  IF YOU’RE A GAMER, BEST TO INVEST IN AN X-BOX, or just use your Windows partition for playing.  I think you should be told now, sooner rather than later, that a great many of your WINDOWS-based games (probably at least a third of them, and especially the newer ones) just are not gonna run in Linux, even with a compatibility-layer like the Wine program.  Some of the rest will have an excellent equivalent in Linux.  And others won’t.  I am not a gamer, so this is not important for my own use.  But it is important for many people.  Linux comes with a great many of its own games, but these of course are not identical to those from the WINDOWS metaverse.  It is also true that many games people like are able to run just fine in Linux.  And more of this functionality is being added all the time.  What I think I would do would be to just invest in an X-Box for gaming, and then use Linux for everything else.  But you are not me, and you may not wish to do it this way, perhaps for reasons which are beyond the scope of this blog.  There are other ways to run Linux and still be able to use one’s favorite MMORPG or other games.
UPDATE:  Just Googling-around lately, and in my curiosity, I seem to have found quite a number of free (and some pay-for) games for Linux—many of ’em 3-D.  I’ve only had time to download and try a couple so far.  It looks like the open-source programmer community has been busy.  Most of these games don’t look at all cheesy, and there are a huge number of ’em on the web that say they will run in Linux—or are built just for Linux.  I’ll try to update this again, when I get more time.
8. Entry 8:  TouchPAD Functionality in Linux.
What can I say—I just am not that big a fan of touchpads/trackpads on laptops, whether I am using WINDOWS or not.  This is a feature I usually turn-off, preferring a USB mouse.  I haven’t seen that many that really worked as intended.  This, of course, may just have to do with the facts that 1) I’m always broke, and 2) I don’t have money.  And so the circles I move in tend toward the lower-end of the hardware market (?).  I will add that while the touchpad never did work right on my good lappy here (a nice, mid-range Satellite with 3 GB RAM and a Pentium T4400 dual-core, with the WINDOWS 7 Home Premium that came installed on it), it went to working just beautifully in Ubuntu 10.04, as soon as I got up the gumption (and a few minutes spare time) to edit the appropriate Grub file to add the acpi=off parameter (see the sketch at the end of this entry).  The touchpad’s performance is unchanged, however, in my WINDOWS partition—it still sucks, which is why I still use a USB mouse if I’m in WINDOWS.  Yeah, there are wireless mice, too.  Those seem popular right now.  It’s your preference.
I prefer a USB mouse, and one of the popular (and relatively cheap) brands, like a Logitech.  Ditto that for USB keyboards, webcams.  I’m a privacy-nut, which is a big reason for my interest in desktop Linux.  And therefore I don’t like broadcasting my mouse and keyboard actions in RF—or IR, for that matter.
You might wanna ask yourself:  do I really, * really * need a touchpad?  Isn’t a USB mouse good enough, if I really want the security benefits (and other benefits) Linux can offer me?
You might also check-out the link:  https://help.ubuntu.com/community/SynapticsTouchpad .  But I’d try my Fn key-combinations first if I were you.  Sometimes it’s just that simple.  Look in your computer’s owner information, or find on the web a list of your machine’s Fn key combinations and what they do.  I’d especially look for Fn + F7, because it is often that one which toggles your TouchPad on and off.  Go to your manufacturer’s website.  Oftentimes, in WINDOWS, Control Panel > System > Device Manager > double-click on the item you think it is, will give you some details as to who built your TouchPad/TrackPad, and what driver it uses.
My good lappy here has a Synaptics-brand touchpad (not to be confused with Synaptic, the excellent open-source package manager that is available in Ubuntu and several other distros).  I tried my Fn keys, and I tried changing settings in System > Preferences > Mouse > the Touchpad tab.  As these were ineffectual, I took the next step in the above link, and installed Touchpad Indicator.  This worked on my system without a hitch, and it is still working.  I notice the top-panel icon for Touchpad Indicator does not appear across re-boots of the computer; but then, it probly is not supposed to.  I haven’t tried, but I’d bet I could just invoke it again from the menus, if I wanted to change a setting.  If you Google “TOUCHPAD UBUNTU”, or “touchpad issues desktop Linux”, &tc., there is a huge amount of documentation and support—especially for the Synaptics-brand touchpads.
I will add that most of the major desktop Linux distros have a Good reputation for supporting the track-stick/”eraser-head” thing that works the mouse on the ** Thinkpad ** type of laptop computer:  and this seems to be true for most of Thinkpad’s model-lines.
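Back to that Grub edit I mentioned a few paragraphs up (the acpi=off business):  for the curious, here is roughly what it looks like on a GRUB 2 system such as my Ubuntu 10.04.  This is only a sketch; back up the file first, and research whether acpi=off (or some milder option) is actually the right medicine for * your * machine before you commit to it:

    # Make a backup copy of the GRUB 2 defaults file, then open it as root:
    sudo cp /etc/default/grub /etc/default/grub.bak
    sudo gedit /etc/default/grub

    # Find the line that reads something like:
    #    GRUB_CMDLINE_LINUX_DEFAULT="quiet splash"
    # and add the parameter inside the quotation marks:
    #    GRUB_CMDLINE_LINUX_DEFAULT="quiet splash acpi=off"

    # Save the file, then rebuild the boot menu so the change takes effect at the next boot:
    sudo update-grub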
9. Entry 9:  TOUCH SCREEN Functionality in Linux:
Touch-screen functionality in Linux has been around for a long time, and this is also true of pretty much every other major family of operating system.  More recent, advanced features, however (such as gestures), may still be a bit “green” where it comes to Linux.  There are numerous free drivers for Linux distros, however, that will at least add touch functionality, if your distro did not already come with it.  Check the distro’s Home Page.
So if you intend to use Linux on a tablet, a USB port might come in handy.  You probably won’t strictly * need * one—at least not just for plugging-in a USB keyboard.  Still, as for myself, I wouldn’t want one of the earlier tablets, which were manufactured without a USB hardware port.  A USB port is handy for so many things (especially for somebody not too apt with technology—such as myself).  Research the issue BEFORE you buy, if at all possible, so you know whether the tablet is a good fit for Linux use.
An excellent page pertaining to touch-screen in Linux on tablets is found at http://www.innovationsts.com/blog/?p=2959 .  This has to do with the Bodhi Linux distribution, an Ubuntu-based distro which I have not tried myself (so far, at least).  I cannot recall having heard anything much in the way of negatives, where it comes to Bodhi.  This distro is still rather a work-in-progress, at least at the time I write this.  However a lot of people use it.  As with all Linux software, it carries the disclaimer that it is “experimental software”, and that you “use it at your own risk”.
Another couple of distros to check-out for use on tablets might be PeppermintOS, or maybe EasyPeasy Linux.  Really, most desktop Linux will at least theoretically run on such devices.  Check the system requirements.  Most current working versions of desktop Linux have the capability built-in to auto-adjust to a device of pretty much any screen size.  (Or screen resolution can be manipulated for future boots by adjusting certain settings; so really, any distro nowadays will work on pretty much any size screen.)  Your main compatibility issues are likely to be with RAM and CPU.  But you can never really tell until you actually try it on the device.  This makes multiple test-sessions with a live cd imperative.  Do some stuff with it from a live-cd, before you pick a distro.
UPDATE:  Give Ubuntu’s new Unity Desktop Environment a try.  It seems now to be out of the beta-testing phase, and into full release.
10. Entry 10:  LINUX, HISTORICALLY, HAS NOT BEEN AS BIG ON “EYE-CANDY” and visual-effects, BUT NOW IT HAS A LOT OF THIS.  (IF YOU WANT IT.)   Contemporary desktop Linux has been fitted with much of this functionality.  Still, don’t expect the exact same things to be available on Linux, as in, say, WINDOWS.  Nor vice-versa.  My install of microKNOPPIX 6.4.4, which I can run from a USB thumb, came with Compiz, and it works flawlessly.
11. Entry 11:  THE THREE STEPS TO USING LINUX are (stated loosely):  1) I gotta find a distro I can successfully boot on my machine; 2) it has to be easy enough for me to actually be able to use, once I can successfully boot-it-up; 3) it has to offer enough user-support, in terms of user-manuals (“documentation”, usually online), forums, and other people on forums who have actually heard of the distro, and more-to-the-point, actually use it—so somebody can help me solve a problem if I have one.
The trouble with Linux documentation is not so much scarcity as that it is scattered.  It can take a long time to find a webpage that answers the exact problem you are having—especially if you don’t know much about computers.  It depends on what it is.  Information about most rudimentary questions is readily available.  See the entries in the tip-list (below) that pertain to support.  I have tried to post my experience here—such as it is—so that you don’t have to duplicate it all.  Even so, you’d better not rely on me as a source; research anything you do, and confirm it with more than one source.  This is a wise policy on WINDOWS, and so the more in Linux.
Find out all you can about your hardware.  Boot your computer from a cold-start, and go into the BIOS menu screen.  Your mouse probably won’t work in there, but you can navigate by using the arrow keys, and hitting Enter to confirm something.  Take care not to make any unintended changes, although most BIOS make it difficult to change a setting by accident.
The BIOS menu is usually reached by tapping the DEL, F1 or F2 key with your finger (at more-or-less one second intervals), early-on in the boot process (almost immediately after pushing the power-switch, to turn the computer on).  Which key it is is usually indicated by the first or second boot screen, usually along the bottom.  This screen usually displays the name of the computer’s manufacturer, and often the logo as well.  Often this screen is only displayed for a split-second or so.  On many machines you can hit the Pause/Break key, and this will pause the boot process long enough for you to scrutinize the screen.  Hitting Pause/Break again will usually cause the boot to continue.
On some machines, it is a key other than the ones I just named.

The F-key that leads to the BIOS * setup * menu is usually (but not always) of lower numerical value (i.e. F1, F2, F3, F4, F5); the key you tap to reach the one-time * boot-device * menu (from which you actually pick the device to boot from) is usually of higher numerical value—F12, F11, F10, &tc.  On some HP computers, it is F9.  Where it comes to a lot of modern equipment (i.e. made after about 2008), it is gonna be F12.  But really, this seems pretty scatter-shot, from one make/model to another.  On some machines it can also be other keys, like DEL, ESC, &tc.  Sometimes you can hit the Pause/Break key to temporarily stop the boot process at this screen, and then tap that key again to let it continue.  This boot screen may flash on your monitor for only about ½ second—or perhaps not at all.  With some [mostly * older *] machines, the default BIOS settings don’t give you much of a time-window for hitting the stoopid key(s) to take you to the boot-menu screen.  And with some of ’em, you also gotta be attempting to hit the right key * at * just * the * right * time, while the bloody machine is booting-up.  Hallelujah.  So some of the BIOS out there are * tricky *:  this is mostly on older or cheap machines—but you never really know in advance, no matter what kind of info you think you have from stickers on the computer’s case, etc.  You might consult your manufacturer’s home-page on the internet for information about adjusting your machine’s BIOS settings, if you can’t find the right keys to tap during boot-up.  On some really cheap machines, you’re sort of on-your-own—you may just have to experiment.

Navigate around in there, and make notes about your BIOS version and manufacturer, your CPU make and model, whether or not it is single-core, dual, or quad-core, and other CPU specs, like the clock-speed.  Make notes of these facts, because even though you probly won’t need many of them to start using modern desktop Linux, it is good to write them down while you have the opportunity, just in case.  Note the size of the RAM, and any other specs that are available.  There is nothing like getting this information straight from the computer’s BIOS, because this is the most likely to be accurate.  When you are done in there, you should be able to find some button or tab that will let you “exit” the BIOS, at which point the machine will probly go ahead and boot into WINDOWS.  Once in there, you should be able to go to Start > Control Panel > System, and check this information against what is displayed there.  It will probably be a lesser amount of information, less specific and less detailed than what was offered in BIOS, as to your hardware specs.  There are other ways, too, to find-out about your hardware specs.  A good way is to just run the Ubuntu live-cd, and open “Hard Info” from the menus.
If you go to the homepage of any Linux distro (or maybe just read the Wikipedia article), there will be information as to the hardware requirements for using the distro—particularly as to RAM/”memory”, and CPU specs.  These are brief and easy to digest.  You probably don’t have to learn anything about the core-structure of your CPU, or whether your computer’s RAM chips are DIMM or SO-DIMM; just learn who made the CPU (Intel? AMD?), and what its clock-speed is, in Giga-hertz (Ghz).  Know how many Giga-bytes or Mega-bytes of RAM your computer has installed.  These two specs may be about all you’ll ever need.  But you should be mindful also of that GPU (Graphics Processing Unit, a.k.a. Your “Graphics Card”), and the machine’s wireless-card (if not a tower computer), and the network-controller.  And any other such stuff.  To be on the safe-side, while you’re in there, write-down any-and-all other specs you can find.  It will only take a few minutes, and you can type this stuff into  a Word document later, and save it, for convenience sake.
You can also learn a lot about your machine’s hardware (“hardware configuration”) by booting most builds of Puppy Linux and opening “Hard Info” from menus (or run it from any live Linux cd that has this utility installed—like KNOPPIX), and then you can just click Hard Info’s print icon to print the results of what it found—assuming it recognizes your printer.  If all you have at your disposal is a Lexmark printer, you may need some luck (more about printers in general, and Lexmark in particular, in Entry 14, below).
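If you’d rather skip the GUI tools altogether, the terminal on almost any live Linux disc can tell you the same things.  A quick sketch (the lshw tool ships with Ubuntu, but some smaller distros may not include it; the hardware-notes.txt filename is just an example):

    lscpu                  # CPU maker, model, clock-speed, and number of cores
    free -m                # installed RAM (and swap space), in megabytes
    lspci                  # graphics chip, sound chip, wired and wireless network controllers
    sudo lshw -short       # a one-screen summary of most of the hardware
    sudo lshw -short > ~/hardware-notes.txt    # or save that summary to a text file for later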
Be aware that pretty much any Linux distro has 1) a * minimum * hardware requirement, and 2) a * recommended * hardware requirement.  As I have tried to indicate elsewhere in this document, Linux is “famous” for its ability to run on older computers with comparatively feeble hardware resources.  But gauging from my own experience, and given how reasonably laptops and towers of recent vintage can be had on eBay these days, I really think it just makes more sense to try and match (or preferably exceed) the * recommended * hardware requirements for any of the distros you’re likely to be interested-in.  I guess a simple way would just be to find the recommended requirements for Linux Mint with the KDE desktop, and use that as a benchmark.  Because this is one of the higher mid-level configurations of desktop Linux, in terms of hardware resource consumption.  If a machine will run Linux Mint with the main-line version of the KDE desktop environment, it will probly run * any * desktop Linux.  Learn to navigate-around the Linux Mint site:  play-around with it, from your WINDOWS computer.  It’s easier than Facebook.
You should understand also that if your hardware is marginal (i.e. it just barely exceeds the recommended RAM and CPU to run the distro you want to use), then while you should pretty much expect to be able to boot and run the distro, you should not be deluded into thinking that you will then be able to run some huge app, that requires even more hardware resources regardless of os platform.  Some people think that because Ubuntu says on the “label” that it’s “i386” compliant, that they can just dig that old T-20 IBM Thinkpad out of the closet (the one with the 256 Mb RAM) and begin dual-booting WINDOWS and Ubuntu on it, and then be able to run Google Earth from the Linux partition (which they couldn’t do from WINDOWS on that machine), and Presto!, Linux’s lower hardware requirements will just make-up the difference.  Well, sometimes this * does * work.  But other times it does not.  Or it is problematic.  Just because you’re booting into desktop Linux instead of the usual operating system, that is not necessarily going to mean that you can just open some big app—one that wouldn’t run from Windows on the same machine.  Though sometimes this will in-fact work.  It depends on things like the size of the RAM, how well the distro of desktop Linux you booted “likes” your motherboard, whether or not you might need a kernel-update, how hungry the app in question is, and other stuff.  More about this as we continue.
Yeah, there are bigger, hungrier apps than Google Earth.  But this is one of which everybody’s heard, so I just cite it as a handy example.
Bottom-line is:  once you get to usin’ Linux, don’t expect to be able to run some monster app that your machine had difficulty-with in WINDOWS, just because you’re in Linux now.  Linux is an * operating * System *, not an application.  A big application that was too resource-intensive for your hardware in a WINDOWS environment may still be too big to run in a Linux environment.  Or it might run just fine, because Linux * is * pretty good at using less hardware resources, * generally *.  As with every rule-of-thumb (especially as regards computers), there are exceptions to this.  Especially where it comes to CPU.  RAM can be a bit of a different story, depending upon several (or more) variables, and these are too arcane to go into right here.
My bottom-line fix for this, as I have indicated elsewhere in this document, is that you just consider a RAM-upgrade.  Yes, not much of any other source that you are likely to come-upon on the web is gonna do much to suggest a RAM-upgrade, just because you want to try Linux.  This “recommendation” on my part is * heterodoxy *.  And why would they?  These people’s main reason for posting a Linux-page in the first place is almost always to proselytize Linux.  And that is just fine.  They are often right in what they say, and the advice that they dispense.  But to my mind, a swap-arrangement (Linux’s rough equivalent of Virtual Memory) is just not a good substitute for sufficient RAM.  Period.  And most machines don’t come with the truly intended amount of RAM anyway; this would only increase their retail price, and so, because the computer hardware market is almost always so competitive and cut-throat, the major vendors have just gotten in the habit of selling you the car without the turbo-charger.  So-to-speak.  They (sort-of) expect us to pay-for the thing that makes it go fast later, after we have committed to their brand.
All I am saying is that I’d like you to consider it.  Do a little research.  The answers you come-up with might surprise you.

12. Entry 12:  CONSIDER THAT LINUX HAS CONSIDERABLE DIFFICULTY WITH SOME PERIPHERAL DEVICES:    Linux * is * a better system, but it has some hardware limitations that are just very difficult to overcome, probably because all hardwares world-wide are tested by their makers for WINDOWS, and not for other systems.  Some people say that this may be because a certain company or companies are exerting pressure on the device-makers, not to have them support Linux.  But I put it down to expediency:  because the computing market is so dynamic, you want to release that printer into the product-stream as soon as you can, and not fiddle with it in the company labs for another two weeks, just to make sure it will support a rival operating system.
I will say right here that I have yet to experience a situation where Ubuntu fails to recognize a USB mouse, keyboard, or multi-port hub; but I’ll add that Ubuntu 9.10 did not recognize my friend Jim’s * wireless * keyboard—though the wireless mouse was recognized.
It may be worth noting as well, that some (a few) makers of hardware devices ignore requests from the Linux community and the FSF, to obtain copies of the source-code for their drivers or other coding, or they are very slow in doing so.  But they always quickly reply to a request from the WINDOWS people.
Before you purchase some device or new piece of hardware for your system, check on the web, and scribble a short list of makes and models you’d like, which are said to be Linux-compatible.  LINUX SUPPORTS A HUGE VARIETY OF HARDWARES, with more support coming.  But not as many as the “ubiquitous ms WINDOWS”—at least not where it comes to just plugging the darn-thing in and using it.  Often times, there is a patch or a driver you can just download.  Research it a little, before you buy.  Sometimes there will even be a terminal at the department store, with web access, that’ll let you look it up.
Save your receipts, just in case you decide you want to return the device or equipment to the store or seller later-on.  This is a good policy anyway.
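Once a device is plugged-in, one quick way to see whether Linux even * noticed * it is to ask the kernel directly, from a terminal.  A sketch:

    lsusb               # lists every USB device the system currently sees (mice, printers, webcams, thumb-drives)
    dmesg | tail -n 20  # shows the last few kernel messages; a freshly plugged-in device should show up here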
12a. Entry 12a:  SOUND CARD SUPPORT IS STILL NOT WHAT IT SHOULD BE IN DESKTOP LINUX.  Even now (late in 2011).  And the multiple layers of abstraction that have evolved in Linux audio make it * bloody * difficult * to figure-out why your sound card doesn’t work, if it’s not working.  Better to go for workarounds—like trying different builds of the distro, different distros, a different sound card, or even selling your laptop online and trying one with a more “Linux-friendly” reputation (probly a very last resort, since it’s hard to determine which lappy will be Linux-compatible, as there are so many mo-board configurations); &tc.  This is a sticky issue in Linux.  And Linux, as I said, is kind of stuck with the multi, multi-layered abstraction levels for audio support which have developed.  This complexity can probly be useful, where it comes to certain things (perhaps where certain programmers would like to improve support for special apps for special-needs communities, such as the blind and visually impaired).  But whatever the reason (if any) for this evolved level of complexity in Linux’s audio sector, it makes it hard for us poor “Windows-folk” to figure-out why the sound isn’t working, when it doesn’t.
Myself, I can report only two sound problems, one major, and one very minor.  When I was still running Ubuntu 9.10 Karmic Koala on this new laptop, the sound was “dim”—I had to turn the volume way up.  A decent set of speakers would probably have solved the issue; but as Karmic was about to reach the end of its updates-support/life-cycle (EOL=”End Of Life”), I upgraded to the next release (10.04).  And the dim-sound phenom just vanished.  The major sound problem I can report, is that on the old Thinkpad T-20 I own that runs Karmic Koala [well, I’ve now upgraded it to Lucid—but still no dice], the sound doesn’t work at all.  I bought this junker to use as a “computer-science Guinea-pig” anyway.  And ms WINDOWS had been deleted from it, somebody having apparently done a quick installation of Linux, probly just to give it a quick and free operating system, just to demonstrate that the machine “worked”, so they could sell it.  So at some point I’ll probably try some other desktop Linux on it.  For now, I just use it once-in-a-while—to do a torrent (say, of some relatively obscure Linux live cd that is unavailable as a direct .iso download), or to experiment with some risky process (re-formatting an external USB harddrive, experimenting with bootloaders).  But this may take me awhile; the cd-drive is rather funky, and probly on its last legs.  And its BIOS is too old to boot from USB.  This just opens the door, though, to me trying other experiments, like making a PloP boot-floppy cd for it, to allow booting from its USB connection.  All I need is more time……
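If you do find yourself staring at silent speakers, a couple of the lower-level ALSA tools are worth a try before anything drastic.  A sketch only (these utilities come with the alsa-utils package, which Ubuntu and most of the big distros install by default):

    aplay -l           # lists the sound cards ALSA can see ("no soundcards found" is itself a clue)
    alsamixer          # text-mode mixer: check that no channel is muted ("MM") or turned down to zero
    speaker-test -c 2  # plays a test noise through the left and right channels; ctrl + c stops it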
13. Entry 13:  BACK-UP YOUR DATA:    Backup any data you don’t wanna lose, before you decide to do any “radical” operation to your files or computer, whether in WINDOWS, or with Linux.  Or any software-stack at all.  Maybe even data you don’t think you care about losing.  Back it up to removable media, like CDs or DVDs, and to at least one other source (like a reputable cloud service, or an external hdd).  Yeah, this is somewhat of a “hassle”.  But as they say in the Army, it’s “S.O.P.”  (Standard Operating Procedure.)
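Once you’re on the Linux side, and if you want a command-line way to do this (alongside graphical tools like Grsync, mentioned in Entry 17), the old standby is rsync.  A minimal sketch (the “yourname” user and the /media/MYBACKUP mount-point are only examples; use your own user-name and whatever your external drive actually shows up as):

    # Copy everything in your home folder onto the external drive, keeping timestamps
    # and permissions.  The -n switch makes it a harmless "dry run" first:
    rsync -av -n /home/yourname/ /media/MYBACKUP/home-backup/
    # If the dry run looks right, run it again without -n to really copy the files:
    rsync -av /home/yourname/ /media/MYBACKUP/home-backup/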
14. Entry 14:  CONSIDER THAT LINUX HAS CONSIDERABLE DIFFICULTY WITH SOME PRINTERS:    Know now that Linux doesn’t support as many printers as Microsoft.  Not even with the CUPS printing utility as a go-to option.  Neither, by the way, does Apple, which also utilizes the CUPS program.  Lexmarks certainly do not have a good track-record with Linux—though in all fairness, I guess we should keep an open-mind, and see what happens with the newer models.  Lexmark support used to be almost hopeless in Linux; however, in recent months (coinciding with the release of Ubuntu 12.04 Precise Pangolin), Lexmark support has been gaining some ground.  As with all things related to PC (Personal Computing), the situation is “dynamic” (in other words, is always in flux, and changing all the time).  Especially when it comes to ** Linux ** Personal Computing.  It’s the nature of the beast.  We all enjoyed having the “stability” of Windows XP for about 12 ** years **.  But this was an aberration, an exception.  We have only had Windows 7 with us for about ** four ** years, and already Microsoft is poised to push Win 8 on everybody. And the interface is rumored to be significantly different.  So PC is in more-or-less constant flux, even in the Windows universe.
Canon printers also have had a troubled history with Linux.  A good type of printer to use with Linux seems to be some HP model that just prints, and doesn’t do other stuff.  Epson seems to play well with Linux too—speaking generally.  Do some research, if you seem to have trouble printing with either of these two.  Many ppl online will try to help you.  Really, a great many (perhaps most) scanners can be got working.  There is the Simple Scan program, which comes bundled with most Ubuntu releases.  And there is CUPS (the Common UNIX Printing System), which is even bundled with Apple/MAC.  There are also other “helper” programs, like Xsane & others, to help Linux desktop users get certain peripheral hardware working.  Linux is always progressing, and new drivers are being published for free download all the time.
A lot of Linux ppl just invest in a printer with good Linux support.  There is what some UNIX people call “the e-Bay patch”:  sell the parts/devices that you can’t get to work with your Linux, and then use the money (plus perhaps a little added out of your pocket) to buy cards/printers/devices that research (Google) tells you will be supported by both your mo-board and Linux.  But I recommend that you try several Linux live cds first, because nowadays your chances are that you will download and burn one that will support everything, OOTB (“Out Of The Box”, as they say).  It might even be that the first distro you download and burn will work on your system.  Have a good look at “distro finder”, online.  If I’m not mistaken, the URL is:  http://www.linux-laptop.net/  I guess another one you could try would be http://www.linuxvirgins.com/  If the solution to something requires some book-length hack via the Linux Terminal/console/command-line, a newbie is perhaps better-off to just keep looking for a work-around.  You’ll probably find it, or think of one yourself. (By the way, for information on how to open the Terminal/command-line/console, see entry No. 51, below.)
Or just keep downloading and burning distros, until you find one that will run on your hardware and support all of its (important) devices.  This is why Linux uses the live-cd distribution format in the first place:  because it’s another help in getting a configuration that will run on your hardware.  Peripheral devices (web-cams, document-scanners) that are difficult might be replaced later.  You can always delete the .iso files, after you’ve found that magic distro, if you want that hdd space back.  When you find a distro that supports your hardware, you can proceed from there.  If the best you can find is a distro that supports all but, say, two of your most necessary hardware components [let’s say Ubuntu works with everything but your soundcard and your wireless card (wifi)]—then you will need to get online (like from an ethernet-connection, or another computer), and learn about your problem, and ways others with the same issue have solved it.  With this possibility in mind, I have named several help/support options and resources in the text below, under the headings “SUPPORT”.
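If the wireless card turns out to be the sticking-point, the first step is usually just to find out * exactly * what chip it is, so you can Google for the fix that somebody else has almost certainly already found.  A sketch (the rfkill tool is present on Ubuntu and most of its relatives):

    lspci | grep -i -E "network|ethernet"   # names the wired and wireless chipsets (wireless usually shows as "Network controller")
    rfkill list                             # shows whether the radio is switched off ("blocked") by a hardware switch or an Fn-key combo
    iwconfig                                # shows whether any wireless interface exists at all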

It may help you to first try some releases of desktop Linux (Ubuntu, KNOPPIX, Puppy, PinguyOS, etc.) that were released in about the same time-frame in which your machine was manufactured. Often this is helpful. Let’s remember that a Windows computer is generally associated with the time-frame in which its version of the Windows operating system was current. A Windows 2000 computer will usually have been manufactured around that time; an XP computer will usually have been built somewhere between 2000 and 2008 or so. Machines sold with Vista were usually built around 2007 or 2008.
Somethin’ I’ll say here right quickly:  personal computing is a multiverse of competing standards, instead of there being one universal standard.  It’s like when VCRs became popular:  before the free-market shake-out produced the situation where there was VHS-tape in most of the country, and Betamax on the coasts, the two competed with one another * in * the * same * markets *, and this was bad enough.  With computers, it’s worse, and this is another stumbling-block in the path of migration to desktop Linux.  Ubuntu, KNOPPIX, and Puppy Linux have added so much hardware recognition to their isos that they have done a tremendous-great-deal to overcome this impediment.  And the results show on them favorably.  BUT CD-DRIVES ARE NO EXCEPTION TO THE GENERAL “RULE” OF COMPETING-STANDARDS.  A desktop Linux cd that you downloaded and burned with another computer may not boot and run on your own computer.  But often it will.  The best strategy therefore is to download and burn desktop Linux with the same machine on which you expect to migrate to Linux.  Remember too to use only quality cds or DVDs—the cheapies will often also result in “unusual” boot-error messages.  Also, it is standard practice to check the file for integrity BEFORE you burn it, using a checksum algorithm—to-day most usually MD5 (via the md5sum), or sometimes still SHA1.  There are many free graphical programs you can download to ms WINDOWS to do this, more about which later.
Links for reference to support what I have just asserted:
http://www.softpanorama.org/Commercial_linuxes/linux_cd_burning.shtml
and
http://forums.fedoraforum.org/showthread.php?t=148759
[see the post by u-noneinc-s, the third answer to the question at the beginning of the thread ]
Remember also that it’s best to burn on the slowest speed you can get your burning software to set-to.  More about all this later.
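And here, roughly, is what that integrity check looks like once you are on a Linux machine (the filename below is only an example; use the .iso you actually downloaded, and compare against the checksum published on the distro’s own download page or mirror):

    md5sum ubuntu-10.04-desktop-i386.iso
    # Prints a long string of letters and digits.  It must match the checksum the distro
    # publishes *exactly*; if it doesn't, the download is damaged and should be re-done.

    # Some distros publish SHA1 sums instead of (or as well as) MD5:
    sha1sum ubuntu-10.04-desktop-i386.iso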

(The same NOTE from Entry 6 applies here:  a live Linux disc runs slower than an installed system, with Puppy and KNOPPIX being partial exceptions, and a bootable USB key made with a free Windows-based tool like the one from pendrivelinux.com is now usually the better, faster way to test-run a distro.  See Entry 6, above, rather than repeating it all here.)
Another thing, too:  you can run stuff to be printed through your WINDOWS install, or else maybe (if you are a hard-core WINDOWS/Bill-hater, and you just want to see ms squirm in the dust until it dies), you could try downloading FreeDOS—which is rather like a free version of the old MS-DOS—and you could even run that from within your Linux, using VirtualBox or another free virtual-machine program.  There are a fair number of legacy DOS drivers available for free download to FreeDOS; however, many are liable to be so out-of-date as to not be of much help with, say, a more recently manufactured printer.  Again, it depends on what it is—the make and model.  So I guess the bottom-line is still to research it, if it is not supported OOTB.  And consider that “e-bay patch”.
15. Entry 15:  Know right now, that due to the way modern microcomputers are manufactured, there are a huge number of motherboard configurations and combinations possible, and it is simply impossible for the Linux Community, with limited funding, to test Linux for all of ‘em.  Desktop Linux can be gotten to run fine on most major hardware.  I have seen not a few instances of somebody (not necessarily a geek) putting together a computer out of spare-parts, odds-and-ends, and some junk they collected—and then booting Ubuntu Linux on it, and all the hardware being recognized, out-of-the-box.  As I said, modern desktop Linux can be gotten to run on most major hardware.  Especially with some support from the community-at-large, in terms of forums, and live-chat.  [Ubuntu live-chat support: http://irc.netsplit.de/channels/details.php?room=%23ubuntu&net=freenode or try Googling “#ubuntu”.]  This support/advice is free-of-charge.  A great many people have gotten it to run on obscure hardware, too.  It’s worth a shot.  BUT THERE ARE SOME MACHINES THAT JUST WON’T ACCEPT LINUX, OR WILL DO SO ONLY WITH VERY GREAT DIFFICULTY.  Keep downloading and trying different distros.  Start with the three I mention in this article, and/or with one of the recommendations someone may post in the comments-section (below).  See entries below for more helpful information, in this vein.  Note that you can, if you wish, just order most of these Linux builds from Amazon, or otherwise online, in the form of a ready-made cd.  If you do it this way, be sure to research the source of the cd itself.  Most of the major distros offer certified, “official” cds and DVDs.  As with anything you’d buy online (or anywhere), research it some first, and find out about the source/seller.  I don’t think there are very many actual scammers out there, where it comes to ready-made Linux cds.  But I’d bet there are at least a few.
Remember, too, what I said earlier about a desktop Linux cd created on another machine possibly being incompatible with the cd-drive or other standards of your target machine.  The best strategy, arguably, is to download and burn desktop Linux with the same computer-and-cd-drive which you intend to use for your migration to desktop Linux.
From what I can glean (which maybe isn’t much), it looks like hp hardware is more likely to be  Linux friendly—though there are so many mo-boards and models out there it could still hand you a stink-burger.  At least at the time of this writing.  I also tend to favor Thinkpads.  Like maybe the x-series or T-series.  Read the comments below, as others are sure to post on this.  Everybody has a favorite brand.  But really, there are so many varied mixes of components available to the manufacturers, and so much of pc manufacturing nowadays is attempted in “real time” (as if this was not always the case), that it is really very difficult for us to pick-and-choose according to brands.  Sorry.  Toward the other end of the spectrum there are brands which have a less-good reputation when it comes to being able to support Linux, or support it well.  I don’t want to mention any names, but the initials of one of these “unfriendlies” would seem to be * Toshiba *.  I hasten to  add, however, that I am typing this on a Toshiba L515 Satellite laptop which dual boots Ubuntu 10.04 and WINDOWS 7 without much issue.  It does still have a few issues, but I may solve those with time.  Frankly, my real project is my netbook.
I will add that, with a broadband connection (in 2011) for my good laptop here, running WINDOWS 7 SP1, with 3 GB of RAM and an Intel dual-core T4400 CPU at 2.20 GHz, it takes me about 15 minutes to download the average version of Puppy Linux (iso format), and about 90 minutes to two hours to download most of the other noob-friendly desktop distros (microKNOPPIX, Ubuntu, Linux Mint).  And this in the background, while I am occupying myself with other work (or fun) in the fore-ground.  The free Video DownloadHelper add-on for FireFox will stack any downloads for which you click “save” into its own queue, and will usually work on two at a time.  (Just don’t let the WINDOWS machine go into monitor-sleep or hibernate, or it might “break-off” the download process, and it may have to be started again from the beginning.)  So you could be downloading Ubuntu 10.04, microKNOPPIX 6.4.4, and microKNOPPIX 6.2.1 automatically, while you play games online—DEPENDING MOSTLY ON HOW MUCH RAM YOU HAVE.
Again, everybody has their favorite programs and favorite hardware, and people will have their favorite downloads-managers for Mozilla FireFox.  I only mention the free Video DownloadHelper add-on by mig because I’ve had such good luck with it, and I’ve tried several.  Other people will post in the comments section with their own recommendations.
Do think about what I said elsewhere in this document, as to considering a RAM-upgrade.  If nothing else, it will increase the resale value of your tower or laptop.  And it always makes WINDOWS run at least a little better, because it has a little more meat to apply to anything you may be doing.  I guess I will add that I run Linux on four of the five computers I currently own, and they do the things I want just fine with no RAM upgrades—though the old Pentium 3 notebook was upgraded before I got it.  I have yet to run Google Earth on any of them, though.
16. Entry 16:   “RIVER” CAN BE BETTER THAN “POD”:      If you’re looking to get away from WINDOWS, getting an iPod is probably not your best bet. While there are many Linux programs out there that interface well with the iPod (AmaroK, GtkPod, etc.), iPods aren’t ideal for Linux, and you’re probably better off getting an iRiver or a Sandisk player. They tend to work well with Linux (without helper applications) and support drag-n-drop. iRivers, too, supposedly support the Ogg format (not just MP3).  (Although I personally tend to think the .ogg has been a project that didn’t turn-out so well, so I use MP3.)
Apple’s iTunes, on the other hand, has no native Linux version; some people manage to run it under the Wine compatibility-layer, with mixed results.  Most Linux users just skip it, and manage their music players with native programs (Rhythmbox, Banshee, Amarok, and the like), so you should still have no problem using an ordinary MP3 player from Linux.
17. Entry 17:  START USING THE MAJOR LINUX APPS WHILE YOU’RE STILL ON WINDOWS:     Begin using several of the following:  FireFox, the OpenOffice.org office suite, GIMP photo and image manipulator, perhaps AbiWord, Apple QuickTime, OPERA, SeaMonkey, and other programs that are standards in Linux, but which have a WINDOWS version that allows them to be downloaded and used from WINDOWS.  Free of charge, just like in Linux.  VLC media-player, if you’re up to it.  (Otherwise the Linux MPlayer is sufficiently like WINDOWS Media Player—with one caveat:  you can create a playlist graphically, but you cannot save it to a file.  Not in a graphical way, anyhow.  There are instructions online, on how to do this using a text-editor.  But really, I find I am able to graphically create and save playlists using the VLC-player I installed to my Lucid Lynx, and this works just fine.  This seems to be how most people handle it.)  Picasa also has a Linux-coded version, and there are various ways to run Photoshop in Linux also, though the impression I get is that PS may be more difficult to run in Linux, as you probly have to use a compatibility-layer (like “Wine”, or “Runinlinux”).  It is also true that “nothing else is Photoshop”.  But I notice most ordinary/casual desktop users find Gimp, Picasa, and/or one of the others adequate.  Real Player also works in Linux (natively:  it has a Linux-coded version), and is not hard to install.  [UPDATE:  it appears that the RP people may be discontinuing their version for Debian-based Linux.  But RP isn’t necessary:  you can duplicate all its features with other, native Linux programs.]  If you are using Microsoft’s Synctoy program to sync/backup files, I guess I’d recommend that you try either FreeFileSync, or Grsync—two graphical programs that also have WINDOWS versions, so you could get used-to them while you’re still on WINDOWS.  There are many, many such backup tools available for desktop Linux.  More about such equivalent programs later.
By taking advantage of the fact that there are not a few graphical programs that run in both WINDOWS and Linux, you can help to prepare yourself.  This way, when you arrive at your Linux desktop for the first time, you will be familiar with the most important stuff.  The interfaces of these programs are usually very similar, in WINDOWS and Linux—assuming you have the same version of the particular program in both—and often even if the version is a little different.  Iceweasel is the installed web-browser in KNOPPIX (and many traditionally KDE-oriented Linux distros); but it is basically just Debian’s re-branded build of FireFox (re-named for trademark reasons), so it looks and works the same way.  I will attempt to create a table of equivalents of WINDOWS and Linux programs in this database, when time permits.
18. Entry 18:  TAKE COMFORT IN THE SIMILARITIES:       The Nautilus files manager that comes in Ubuntu 10.04 (and some other distros) is very, very similar to the files manager in WINDOWS 7.  Same look.  Same feel—especially if you switch Nautilus from “icon view” to “list view”.  Searching with ctrl + F works the same.  The left view pane is practically identical, and works the same way.  This is also essentially true of the files-manager in the current KNOPPIX series, which is the 6-series.  Yeah, somebody who is really into this will post below, and reprove me.
The ROX-Filer files manager that is the default in Puppy Linux, is laid-out a little differently than what a WINDOWS user is used-to.  But it is not at all hard.  Boot it-up live, from the cd.  Play-around with it for half an hour or so.  You’ll see how it works.  I find it grade-school-like, in its simple look and feel.  I’d think a fourth-grade class could learn to manipulate it, in an afternoon.  Or less than that.  And there is a fair amount of documentation about it available, by googling.  And by the means I elucidate on this site.
(Again, see the NOTE in Entry 6, above, about live discs running slower than installed systems, and about testing desktop Linux from a bootable USB key instead; there is no need to repeat it all here.)
Most of your major keyboard commands that you are familiar with from WINDOWS work just the same way in most Linux.  Ctrl + F, ctrl + c, ctrl + v, ctrl + x, ctrl + z, ctrl + s, &tc.  Just as in WINDOWS, most Linux programs of any size and significance come with their own keyboard shortcuts (though of course these just supplement the ones listed above, generally), and such can be found listed online.  You can copy these lists to a document, for easier reference.  It is also possible to set your own, by “binding hot keys” (Google this).
Note that the equivalent to Windows Control Panel in Ubuntu (and most of its off-shoots) is Software Center, which can be accessed by clicking on “Applications” in the top panel/menu-bar.  You should then see “Software Center” in the menu that appears.
Software Center has two main parts.  One is “Get Software”, in which the shell usually opens by default.  The other is “Installed Software”.  This usually appears as an option next to “Get Software”.  Click it, and it should display a list of all the jazz you have currently installed to the system.  If you wanted to remove something, you could click it to highlight it, and then click remove/un-install.  I counsel against removing anything at all that originally came installed with the system, however.  These apps contain “dependencies” (similar to .dll files in Windows), and these are stored routines that may be necessary to be shared by other apps, so that these can fully execute.  Removing programs you did not install yourself can lead to dependencies-issues in Linux.
See the entry on “Stick to your distro’s Repositories”.
I don’t think most Linux distros use alt + ctrl + delete (the WINDOWS way to get to Task Manager, or to re-boot).  Some do.  (Ubuntu claims to have this enabled.)  Instead though, there is ctrl + alt + F1 (in Puppy Linux I think it’s ctrl + alt + Backspace), which will often work to take you to an X-window, and you can re-vamp things from there with command-line commands—provided you know some.  I will try to compile a list of the most important ones, as soon as time permits.  Until then, there is Google.  Or my document “linux common admin commands”.  Or http://www.omgubuntu.co.uk/2011/07/top-terminal-commands-newbie/ .  You should probly read the comments too.
NOTE that if you should enter the “x-window” environment (black screen) in Linux Recovery Mode (like, say, from a Grub screen—Grub being the add-on boot-laoder/chain-loader), then you will BE ROOT/Admin, in whatever you do.  This kind of thing is necessary in WINDOWS, because certain recovery/restore functions in a WINDOWS system will only execute with Admin. permissions.  But Linux is different—especially modern distros, which really do not need you to log-in as root, even if you had to perform certain recovery functions.  And such as these are rather a rare thing to have to deal-with in desktop Linux anyway.
Using ctrl + Alt + F1 DOES NOT MAKE YOU ROOT (Admin.).  Nor do you need for it to.  In order to become the true Root/Admin. in an x-window/black-screen environment, you would first have to type the name of the True Root, and then the real Root password, which in Ubuntu, Mint, and Ubuntu’s variants, is hidden.  And not necessary for even the machine’s owner, because of the SUDO utility.
[A way around—to do this anyway—would be to boot the machine, and use the up/down arrow keys to select a “recovery mode” kernel option from the Grub screen.  This lets you be Root/Admin (aka run-level-one) in the black, x-window environment.  And you may not even be prompted for the Root password, or any password, to make changes from this environment, because Linux may just assume you must be somebody special, and with access to the actual hardware of the system (in other words, not a black-hat hacker, who is trying to gain access over the internet), or else it couldn’t have booted into recovery mode.  But I do not think most distros are gonna let you boot the recovery-mode, and then shift to a graphical environ while still assigned as Root.  Nor should anyone but the most experienced person attempt to do ANYTHING from this mode.  And I think there is a way to set Grub’s permissions so that a password is required to enter this mode.  More about this later. ]
An exception is Puppy, which it seems is safely run from root/Admin. all of the time [at least as long as one does not do the “traditional-type uncompressed harddrive install—which is not the “default” in the Puppies anyway].  But then Puppy has been (very cleverly) thought-out to “head-off’ and “work-around” many of the common pitfalls that do exist in desktop Linux (for us noobs, at least)—even with the modern desktop distros.  And Puppy Linux was created with the intention that it would run only as a compressed, live-only file-system.  More about this later.
Note that the terms “bootloader” and “chain-loader” are sometimes used interchangably.  This is somewhat a term-of-confusion, but for * most * of our purposes here, it will not matter that much.  I have written a brief clarification of this in the tip-list (below), under the entry to do with “CONFUSABLES”.

Do be aware that when it prompts you for your password in this “X-window” environment, the internal clock is usually set to allow you only one minute to type it in & then hit Enter.  If you mess it up, most systems will give you as many tries as you need.
Know also  that the X-window environ will not even display dots to represent the characters you are typing, * WHEN * IT * COMES * TO * YOUR * PASSWORD*.  So it looks like nothing is happening, if you have to type-in your PW in this environ.  Just be patient, be on-the-ball, and mentally keep your place.  X-window will show the characters you type for all other operations.
NOTE that there is also the “magic sys-req keys”, a.k.a. R.E.I.S.U.B., which may not work in some distros unless the setting is enabled by you ahead of time, once Linux has been installed.  Sometimes it fails to work anyway, and your only option is a “hard reset”, which means holding-down the power-switch on the computer’s  case with your finger until the dern thing finally turns-off.
Google “enabling magic sys-req keys in Linux”.  Or view the document “l ubuntu emer commands n shutdown” in my database.  This is another method to safely shutdown Linux and re-boot.
BE AWARE also that when you click your mouse or otherwise give Linux a command, LINUX IS A LOT MORE ASSUMING, in taking it for granted that you’re on-the-ball, and that you really MEAN to delete those files, or uninstall that application, or whatever.  LINUX DOES WHAT YOU JUST TOLD IT TO DO, OFTEN WITHOUT ASKING YOU FOR CONFIRMATION.  Or it may only ask once.  Linux does not “take into account” that you may be tryin’ to talk to your wife about your toddler-son’s baby clothes, as to what should be done with them, while you’re working on your computer at the same time.
SO LINUX IS LESS LIKE A NANNY.  I’m not sure if this is a good or a bad thing.  But it’s a general fact.  Ubuntu (at least 10.04) is more of an exception.  Ubuntu seems more likely to give you a pop-up for confirmation—and sometime more than one—reminiscent of WINDOWS.
Be smart.  Print at least some of this out before you start using Linux.  Or find a way to have access to at least some of these instructions from outside of your computer beforehand, so that you will be  able to consult them if you temporarily get into trouble because of unfamiliarity, or some hardware compatibility issues that need sorting.

19. Entry 19:  BROWSING FROM LINUX:
*  IF * FireFox is buggy or breaks (quits working—though this probly won’t happen), I have a few suggestions.  You could try uninstalling and then re-installing FireFox.  Re-installation is best done from the repos.  If this is ineffectual, I’d try a newer version of the browser.  It is also true that you can install or re-install many apps (like a web-browser) from the Terminal/command-line without the help of any browser at all.  This is (sort of) one of the differences between WINDOWS and Linux, and can be accomplished by means of an apt-get command, or commands used with other utilities that probably came installed with your Linux.  So FireFox can be installed from the command-line.  Do not be intimidated by the command-line/Terminal.  This is a reliable way to control a Linux, and to change settings, when GUI-means seem to be becoming problematic.  Check your user manual, or this database, or use the man command, to find out beforehand whether or not you have a particular utility included in your install or disk.  See entry No. 51, as to how to open the Terminal/command-line/console.
An IMPORTANT CAVEAT I will insert here, is that you should understand and know the bare basics of command-line/Terminal commands BEFORE YOU USE ANY, and cross-check commands using different sources (webpages, books) to verify that somebody isn’t playing a nasty joke on you, by suggesting a malicious command—though I have never seen this.
Here is a link to a nice resource for us newcomers to use to begin familiarizing ourselves with the Terminal.  You needn’t read this whole set of webpages (which really amount to a free online e-book), but rather enough to help familiarize yourself.
http://linuxcommand.org/learning_the_shell.php
It may be tempting to blindly type “commands” you found on some web site, expecting that they will do the described task. However, this sometimes fails just because you have a newer version, slightly different hardware or another distribution. You could try to execute each “command” with the –help option first, and understand what it is supposed to do. Then it is usually very easy to fix various small problems (/dev/sda -> /dev/sdb and so on), achieving the desired goal.
Do not run rm -rf / or sudo rm -rf / unless you are seriously considering deleting all of your data.  Run the command ‘man rm’ for more info.  The “man” command (without the quotes) will give you the user-manual information for pretty much any program on the Linux system—provided the makers of the app or the distro included it in the distro.  Most often it will be in there, and “man” will display it.  For example,
man vlc
and then hit Enter.
This may display the contents of the VLC media-player user-manual on your Terminal—assuming VLC is installed to your system in the first place.
Note that Linux man pages are of a more technical nature, generally, than most of us ordinary WINDOWS operators are used-to.
I think it’s the down-arrow key that pages down in Terminal.
In most distros, you can highlight stuff in the Terminal by dragging the mouse—either from right to left, or vice-versa—just like normal.
Other functions (like copy and paste) don’t work exactly the same way.  See the entry pertaining to “PASTING INTO AND OUT OF TERMINAL”.
Similarly, don’t create a file named ‘-rf’. If you run a command to delete all files in that directory it will parse the ‘-rf’ file as a command line option and delete all files in the subdirectories as well.
If you want or need to save some “man” instructions from a display (called a “print”) in Linux’s Terminal/command-line, one way is:
# PAGER=cat
# man less | col -b > less.txt

Another I guess would be (to save it as .pdf):
man -t awk | ps2pdf – awk.pdf
Notice that that long vertical bar after <less> and < man -t awk> is the symbol from your (full-size) computer keyboard known as the “pipe”.  It is found on the same key as the back-slash we use in Windows.  (So it’s that you hold down SHIFT, and punch the “ \ “.  )
Other means are available online.

I guess I should add here, that I have uninstalled FireFox from my Ubuntu 10.04 (and with the “purge” command, which is a sledge-hammer to kill a fly; I probly should’ve used “remove”), and then re-installed it using the appropriate Terminal command (sudo apt-get install firefox).  Twice.  I think I even got a slightly more up-to-date version of FF the second time.  (I should add that I had to uninstall it and re-install several times in Windows Vista, and I know several XP users who occasionally have to do the same.)  Turns out FireFox was not the underlying problem in my Ubuntu all along.  It was other stuff, which I go-into elsewhere in this document.  (Although I * have * mostly switched to SeaMonkey for my Ubuntu 10.04 on this machine, with FireFox running, but minimized:  why this is better may have to do with the need for a custom kernel for this particular laptop.)  But I just want to say for right now that if you go uninstallin’ FireFox from Ubuntu (especially with “purge”, which is probably overkill), I WOULD NOT delay in re-installing it.  Its associated dependencies may be needed; Ubuntu will probably compensate for this by loading Epiphany, or whatever the next one is in the “batting order”, so that * ITS * dependencies can compensate.
But I’d prefer to just re-install FireFox without delay—even if you’ve decided to start using something else as your browser of choice.  If you would rather un-install some program from, say, Ubuntu 9.10 or 10.04, and still leave its ancillary-stuff in the system (like its dependencies, which other programs may need in order to run properly), then I think “sudo apt-get remove” is probly the command to use, if you’re not gonna do it graphically.  As in “sudo apt-get remove firefox”:  & then hit Enter.  That simple.  Remember to type it into the Terminal WITHOUT THE QUOTES.  And you might wanna check me on this “remove” command:  I just cited this one from memory, because I couldn’t find the reference in my notes just now.
But speaking as a “noob-boob” who doesn’t really know what he’s doing with respect to Linux, I have to say that I tend  to gravitate to defaults.  And FF is the default browser.  I am also loathe to play-around, to  experiment too much, because my primary aim is to get this sucker goin’ for some productivity.  Read the posts below:  we can both be instructed by  the flaming of more experienced ppl.
Myself, I will say here that I have had at least as good an experience with SeaMonkey (also a Mozilla product, and a first-cousin to FireFox).  I have SeaMonkey running in both WINDOWS and Linux, and I find that it seems to do well in Linux.  Many of your favorite FireFox add-ons will run just fine in SeaMonkey, if you add them to SeaMonkey.  Many will not.  You add these add-ons to SM just like you would in FF:  you go to the SeaMonkey add-ons page, and search for the one you want in the site’s search-window.  Sadly, many favorite FireFox add-ons will not be available to SeaMonkey, as the community is smaller.  One hopeful thing about Linux/FOSS is that, if you get into it enough, you can write your own SeaMonkey or FireFox app(s).  SeaMonkey is easy to use, and can be set-up to be very secure.
In SeaMonkey, you do stuff more from the menubar, as add-ons don’t always provide a toolbar icon by default.  Maybe this is why some people have said SeaMonkey to be more of a “professional’s tool”.  If by this they mean it is hard to use, I cannot agree with them.  If you can do your banking from an ATM machine, you can learn to use SeaMonkey, and this probably by just fiddling with the controls.  You should not be afraid to fiddle with the controls anyway, to a reasonable extent, in Linux.  You should probably set aside some “play-time” just for this purpose.  Because doing this (to a reasonable extent) will help you learn.
An example of using SM would be that, for instance, I added dw free Video Download Helper from SeaMonkey add-ons, but I have to go to “tools” > download helper > media > download, in order to download some You Tube video I’m watchin’.  Whereas in FireFox, the same add-on provides its own (animated) logo, with an adjacent drop-down menu, which is an even more graphical way of enabling this function.  Still, all of this is done with the mouse, so it is  still GUI—no command-line stuff in SeaMonkey (unless you want to).
You might take note of the fact that if you want to highlight a URL in the address-bar of FireFox or SeaMonkey * in their * Linux * incarnations * , it does not automatically highlight the whole thing, just because you clicked in the address window.  Rather, the way I do is  to click at the right end of the text, and, when I see the cursor-bar flashing, I hold-down the left mouse-button, and then drag to the left-and-slightly-upward at once, and it usually highlights it all.  Or else just keep dragging left.  If it is one of those that takes-up more than the length of the address-window, I hold down the left button, and drag right slowly, to the end of the right side—and slightly beyond it.  This will cause the text to “slide”, visibly, beneath the cursor’s held position, until all of it has been highlighted.  When all is highlighted, you can then use ctrl + c to place a copy of the URL onto your Linux computer’s clipboard.  Then you can paste it where you want.  Either method should work to highlight a URL in either browser.
Just because you are in Linux, it does not mean you should be careless as to browser add-ons.  Research anything you intend to install to your web-browser, just as we are supposed to do in WINDOWS.  The Mozilla Foundation makes this easy, because a link to user-reviews is posted somewhere in the add-ons webpage.  And there’s always Google.  Type the name of the proposed add-on into the Google search-window, and add “user reviews”.
I guess you can install the Mozilla plugins (Firefox plugins) for VLC or Mplayer if you want.  But in truth, the default Totem/M-Player plugins perform much better in rendering videos on websites.
Just as in WINDOWS, there are ways to control scripts that want to run from a webpage, and other settings and means to make your browsing more secure.  This is a good practice, even on Linux syastems, and is not that difficult to set-up.  The 5- and 6- series of microKNOPPIX come with a lot of this functionality already set.
IceWeasle is just the version of FireFox that usually runs in a KDE desktop.  That is probably why current versions of KNOPPIX (6-series) retain it—KNOPPIX historically was known as a KDE distro.  The contemporary KNOPPIX 6-series comes with the LXDE desktop, which is pretty nice new desktop environ made for Linux.  LXDE is “lighter weight”, in that it is hungry for fewer system resources than KDE, GNOME, or some others.  It got some bad reviews, initially, for not being ‘intuitive” (being hard to figure-out); but I find that in microKNOPPIX 6.4.4 (at least), the layout is just as “intuitive” (to * ME *, anyhow), as WINDOWS XP.  And the 6.4.4 desktop has a Start button in the lower-left, much like Microsoft Windows.  You can, however, replace it with KDE, if you want.  Just be sure your computer meets the recommended hardware requirements.
Google chrome browser has a “family” of versions that can run in Linux.  There is Google Chromium browser, which is rather a version of Google Chrome that is coded with all open-source code (at least that’s what I’m told); and Iron, which is all open-source, plus some nifty  extra features (and a few “experimental” features).  As these “Chromiums” are very lightweight (speaking generally), many people like  them on less-powerful hardware platforms, like notebooks, netbooks, and tablets.  These third-party open-source re-mixes of Google’s Chrome browser also usually “bleach-out” Google’s tracking and key-loggering software, which is still present in the regular Linux version of Google Chrome.  If you read the Chrome EULA  carefully, it will probly tell you in there somewhere, that Chrome tracks your usage-habits.
It’d be nice if Apple/MAC’s Safari web-browser would run in desktop Linux, but I don’t think it does.

20. Entry 20:  PLAYING A DVD
Some Linux distros come with this capability installed.  Many do not.  At least when it comes to certain commercial formats, which would impose certain requirements upon the distro, in terms of EULAs and/or logos needing to appear on the desktop the first time you boot it, and other IP (Intellectual Property) issues.  (Remember that it was also probably true that you had to install Adobe Flash and Java/JRE to your new WINDOWS computer, once you got it home from the store.)  Anyhow, you still want to watch your movie.  I counsel that you just try the DVD first.  Of course, I’m assuming that you’ve already installed Ubuntu Restricted Extras package, if you are using regular Ubuntu, and not one of the variants (which often come with this package already).  If you still can’t get the DVD to play for you. There is information as to additional codecs, available on the web. This issue is easily fixed in most cases, by just downloading and installing a package or two.  See entries 42 and 44, and https://help.ubuntu.com/community/RestrictedFormats .  These type of packages (to enable functionality for popular multimedia formats) are usually configured to be very easy to deal with, and take only a few minutes.
21. Entry 21:  DON’T KILL YOUR WINDOWS!         Do not over-write or otherwise dump your WINDOWS install.  If you decide to do a harddrive install, I suggest going with a dual-boot arrangement. Which is the default anyway, in most Linux installer “wizards”, which are activated from a live cd when-and-if you click the install button that is displayed.  Because even if you get so used to Linux that you rarely boot Windoze anymore, it can still come in handy once in awhile.  And we need it for awhile, anyway, to hold our hand as we cross the busy street, until we finally arrive at Linux saftey, on the far side.  WINDOWS is a decent fall-back position.  In Vista and 7, WINDOWS can re-size itself while it is running, from WINDOWS Disk Manager.  And WINDOWS can still, for example, serve as a good “hacker-bait”/honey-pot in a crowded coffee-shop, if you want to draw firesheepers away from your real ip.
It is also wise not to over-write it for monetary reasons:  someday, you may want to sell your laptop, in order to buy a better model.  You and I may sort of understand Linux (and like it); but the world is, well, the world.  Many, many people do not understand much beyond Start > My Documents.  And they are not anxious to learn.  Most people (perhaps sensibly) are just consumers of these operating-systems, and just aren’t interested in doing, well, more of the things you can do with a computer.
It is also true that some computers have great difficulty with this “dual-boot” arrangement, despite all that has been done to make it smooth.  I have not had such difficulties myself, on any of my machines, or those I’ve installed a dual-boot on to:  but this is a very small sample, statistically speaking, if I have not said this already.  What I tend to recommend, in any case, is that one A) try at least a few distros as live-cds, so that Linux is tried from the computer’s RAM, which is temporary and ephemeral; B) when you think you’ve found your distro-of-choice, install it to a USB thumb-key and try using Linux from that.  Another worthy option is an external, add-on harddrive, or a micro-drive (mini-external-harddrive).  That way, you are not called-upon to make any permenent changes to your machine’s (internal) harddrive.  You could always install to the harddrive later, if you wanted.  See information in this database as to BIOS-booting Linux.  (It is also true that a lot of people run Linux as a “virtual machine”, using virtualization software, such as VMWare.)
Note that as we want to be able to boot and use either system (WINDOWS or Linux)—however we ultimately decide to set-up the booting of Linux—if you have a clipboard manager installed/enabled IN EACH SYSTEM, you’ll find the use of documents that you open on both systems to be easier.  Most builds of Puppy Linux and it’s variants seem to come with a clipboard manager on the desktop, by default.  If you can’t find one for a linux distro, various ones can be gotten from your repositories.
22. Entry 22:  CHOOSE FILES-FORMATS THAT WILL WORK ON BOTH PLATFORMS    To date, I have  converted all my documents to .odt (this is Open Document Format).  I used .doc almost exclusively for about a year, even after switching to Ubuntu as my workhorse operating system.  Yes, this is still a proprietary, ms format.  But it can be read reliably from any system in the civilized world.  Probably from * any * system in the world.  But (by now, at least), it seems .odt can also be read/written-to by just about any operating system—Linux * or * Windows.  Versions of ms Word 2007 + higher save to .docx, by default.  But this .docx format may not be supported in your Linux—check to make sure; really, since roughly Ubuntu 10.10 or so (roughly 2010), .docx is supported in Linux—both the Debian/Ubuntu stuff, and Fedora/RPM-based.    I’m not even sure if all Apple computers can read it, but I would think so.  Since the time I first wrote this, compatibility of office programs in Microsoft and Linux has improved:  by 2011 or so, only people with * old * versions of MS Office (& the like) would seem to be in much difficulty; MS Office 2007 + up seems to be able to read .odt just fine, and Linux’s flagship Libre Office can read .docx, as can Open Office 3.2 & higher.  So just about everybody can just breathe-easy now, where documents-formats are concerned.  Know also that older versions of ms word (like prior to 2007) cannot read .docx, by default.  As for ms documents apps before the 2007 version, ms may let you download a patch for this, free-of-charge:  but I’d have to check that.  Just as likely, they’d tell you information on how you could purchase a newer copy of ms Word, or ms Office.  At a “reasonable” price, of course.  And of course you have to keep in mind that if you were to purchase a newer version of, say, Word, you might also have to upgrade other stuff, to support * that *.  Stuff like your operating system and your computer itself.  Linux, by contrast, is a lot less likely to force such expensive requirements on you.
There are macros/scripts available on the web, to run from the dev tab in Word, in order to convert .docx to .doc., as a batch/mass-convert.  As with any radical operation, research it first, and backup all data that could be affected to a cd or DVD (and ideally at least one other source, like a cloud) before you begin.
Know also that the .doc format (a.k.a. “ms Word 1997/2000/2003”) has been notorious for various spywares, and some malwares.  I’ve switched to .odt, and that’s the format for most of the files in my Dropbox cloud.  I access it from both Linux and from Windows 7, when I occasionally boot that to do some certain operation I was having trouble learning to do in Linux—though these boots of ms Windows are becoming few-and-far-between.
If you are familiar with the function, you can use a Macro in Word 2007 to convert documents from .docx to .doc (or .odt) as a “batch process”—perhaps letting it run overnight.  There are other Batch-files methods available to convert a whole bunch of them in one fell swoop.  Just look around on the web.
I find that Open Office Writer 3.2—which is what you use in Lucid (Ubuntu 10.04)—seems like  the equivalent of something from 1998, though I hasten to add that this can actually be * better * in several ways than some of the later word-processors—these being somewhat overdone, from the standpoint of an ordinary home-user.  I really should  try to install Libre Office—I wonder if there’s a version of it that runs nice and stable on Lucid?  Open Office 3.2 is nice.  But it has a harder time opening and closing big docs with a lot of formatting (.doc, I emphasize:  I’ve switched to .odt/.odf, since I first wrote this, and the issue essentially went-away in Linux—but now I have it in Windows 7’s MS Office Word; Word 2007 stumbles-and-chokes, taking noticeably longer to open my documents now that they’re formatted to .odt).
My solution was to download and install Open Office 3.3 to Windows 7.  Now it works smooth.  But if you’re going to have to work part of the time on a system that won’t let you install Open Office (because the person in charge of the equipment won’t allow it), then maybe you’d better think twice about migrating your documents to .odt.  I’ll add only that .odt takes-up * way * less space on your harddrive (or any location) than .doc, and is apparently much more secure (Google it).
Features are a little more cumbersome (to get used-to) in OO Writer 3.2, and some are lacking [where is “toggle case?; “continue numbering” (where you’ve left-off previously) is more involved than in ms Office Word 2007].  It is also true that I had to fork-out about 130 USD to Microsoft for its Office suite, and it jammed-up one of my big, highly-formatted files in .docx, and I could only recover it by closing it with WINDOWS Task Manager, and then re-opening it (the next day) with ms Works.  Then I had to re-create it by successive copy and paste back into Word ‘07.  And even then I could not recover all the images, so those were just lost (note also that this was when I was still having to use MS’s Vista operating system, as I had not picked up my Win 7 machine from the store yet).  So perhaps it should be said that neither system is perfect—neither Linux nor * Windows * .
In either case, if I had only been using Dropbox at the time I was creating this file in the first place, I could’ve just gone to my cloud and retrieved a slightly older version, and saved most of the lost images.  I must say I find Dropbox works every bit as well in Ubuntu and Mint as it does in Windows 7.  And I will add that I am a user who has more than 800 + documents of all sorts (my co-major was Divinity Studies)—several running into more than 150 pages, with images and screenshots.
Nothing against .odt—really, the whole world should be using this standard.  But as I have said, some Windows users (particularly those with older versions of ms Office—and there are plenty of them out there, at the time of this writing—2011), well, some of these persons may not be able to open .odt without installing some update or plug-in, or they may just object to having to deal-with this type of “weird” format on a regular basis.  Many ppl are very conservative, where it comes to their Windoze install—perhaps to a large-measure because they have just learned to fear “viruses”—to live in fear of “viruses”—what ever that means to  them, for whatever reason.  F.U.D. (Fear, Uncertainty, and Doubt) is a factor in computing—even in the Linux meta-verse; but it is more pronounced in the Windoze meta-verse, and this for cultural reasons, as much as because of the software itself.
By the way, I routinely limit myself to about 50 pages, where it comes to the size of documents.  This is a habit I got into while using microsoft Word and also msWorks, because it makes it easier for the program to open the document, especially if it is highly formatted (has lots of links and images in it).  One might say that this is a good practice on any platform, especially if you are not yet so proficient at personal computing that you don’t need to occasionally consult instructions-files, some of which may be stored as documents-files:  documents with a significant amount of “formatting” (images, hyperlinks) are harder for pretty much any system to open, and I find that trying to keep them below roughly 50 pps just makes my life easier in the long-term.
Deleting the side-bar comments from .doc s into which I have pasted formatted stuff in OO Writer also speeds things up a great deal, and I’ve never had occasion to use the “comments” for anything anyway.  Just horizontally scroll to any one of the comments, then click the little drop-down arrow it has with it, and click “delete all comments”.
23. Entry 23:  LET YOUR HEAD—OR AT LEAST PART OF IT—BE IN THE CLOUDS:        Like most major apps, DropBox has a Linux version.  I find it works wonderfully well in Karmic Koala (a legacy release I have on an old laptop I keep around), and Lucid (Ubuntu 10.04.2).  Some of the “major WINDOWS apps” that we think of today, by the way, really began on Linux!  I find that using DropBox was a great way for me to import my then 300 + documents to my Linux, once I had converted them from .docx format to .doc (WINDOWS 1997-2003 format, which pretty much any computer can read).  And it gives me the added bonus of being backed-up automatically to my own (free) 2 Gb DropBox cloud.  I can use my documents in both my WINDOWS partition and Linux, and they’re automatically synced for me!!
Note that Ubuntu One is Ubuntu’s own cloud service, and that (as of the time of this writing), they give you five (5) Gb FREE storage, instead of 2 like DropBox.  And Ubuntu One now has a WINDOWS client.  I’m gonna give this feature a test-drive, when I get ‘round to it.  I will add only that my experience over the years with DropBox has been nothing but GOOD, and this is lucky for me, because I have a LOT of documents (for a casual user), and I am very touchy about them possibly getting corrupted or lost.  I have nothing but good things to say about DropBox.  DropBox has been VERY stable and reliable over the years, which is why I continue to use it.
Note that as we want to be able to boot and use either system (WINDOWS or Linux)—however we ultimately decide to se-up the booting of Linux—if you have a clipboard manager installed/enabled IN EACH SYSTEM, you’ll find the use of documents that you open on both systems to be easier.  Most builds of Puppy Linux and it’s variants seem to come with a clipboard manager on the desktop, by default.  If you can’t find one for a linux distro, various ones can be gotten from your repositories.
Note that there are many other “cloud services” as well.
There is SpiderOak, which sounds like a tree you wouldn’t want to sit-under, but which has a pretty solid reputation among users of forums and chat-rooms devoted to this sort of thing, and which also offers a 2 Gb free account, if memory serves me right.
Binfire.com offers a CHUNK of free space, and I have used it in the past, with some descent results.
Of course there is Google Docs, which is always developing new features to serve us.
What I am going tom say next may be Linux-heresy, but Microsoft will offer us 7 Gb of free storage space in our own “cloud”, if we just have a Hotmail account.  And you don’t even have to use the –mail feature hardly at all, to keep it open.  It has worked, too, even though I have been interfacing with it from a Linux machine.  Though, sometime after I got my latest laptop, and got it set-up with Ubuntu 10.04—as well as Linux Mint 12, and Backbox 2.01 (since upgraded to 2.05, because 2.01 destabilized on me—though this was after I installed a few things; BB is not so much intended as a * desktop *, anyhow)—well, I’ve since found that I can read my Hotmail, but when I try to open my skydrive, it won’t work from any of my Linux.  This may be due to having somewhat older versions of FireFox in Linux, as that’s what you often get in desktop Linux distros that aren’t said to be “bleeding edge”–but which are rather put-together by their makers for greater stability.  I,m gonna try accessing my new Skydrive cloud from my Windows partition, and I’ll update this as soon as I can, to let you know how it went.
24. Entry 24:  Google EARTH          This just is not a program I much use.  It is a bit slow, even on my good laptop with 3 Gb. RAM and a pretty good Intel core-processor.  [UPDATE:  the latest version runs ever so much better from my install of Linux Mint 13 XFCE Edition—and on ** my ** netbook! **.]  I had a look at some installation instructions online (when I was using Ubuntu 10.04), and it seemed like a lengthy procedure.  But this was some time ago (years, actually).  To-day, I find Google Earth is usually easy to install, where it comes to the LTS releases of Ubuntu, and most of the variants that are based on either 10.04 LTS or 12.04 LTS.
25. Entry 25:  ALPHA AND BETA:    What do they mean by software being in “Alpha”, or in “Beta”?
Put very simply, any software (even early versions of WINDOWS) is first released from a software laboratory to users as “Alpha 1”.  This population of users is often something much less than the general public, but rather a limited pool of persons or companies selected by some means for this purpose.
Later on, it gets to Alpha 2, and then to Beta 1, and then Beta 2.  Sometimes development stops at Beta, just because it is working well, or because the developers never get around to changing the official name of the software.  This is the case with Grub 1.97 bootloader, which “never officially made it out of Beta”—though for all practical intents and purposes this version of Grub is a final release, and very stable.  It has been replaced in recent years by the so-called “Grub 2”, which also went through this release-cycle while in development, and is officially “Grub 1.98”.  But people just call it Grub 2.
There are more programs out there than you’d think, that have never officially had their Beta designation removed, though they continue to function as if they had.  But you’d better research a program that is tagged as “Beta”, specifically to find out if a normal release is in the works.
Just to confuse us a little more, the software developers sometimes also use “Release Candidate” (abbreviated “RC”), which from what I can gleen is one of the last steps in the development process, before the software “makes it out of Beta”.
26. Entry 26:  A WORD ABOUT DESKTOPS AND CHOICES:
As I have said, I am not a hater of Microsoft or its WINDOWS operating-system for desktop computers.  Both systems have virtues.  WINDOWS will probly always be easier to use.  At least until you get a particularly bad virus or trojan, or a worm or a root-kit, or some spyware or some other malware.  And pretty much anybody can look-inside a WINDOWS system.  Even your 13 year-old granddaughter may know how to use a files-restore disc to look at copies of files you deleted—because when you delete any files from either system (or have even just played them from a cd or DVD), they “leave tracks” on your harddrive.  There are plenty of hacker-tools available for download off the web nowadays, and most of ‘em are free!  And probly 95% of these are aimed at WINDOWS.
I guess I will try to be brief here, and try to just state the minimum you need to know.  You can look for elaboration elsewhere in this data-base, or on the web.
Please try to remember that some of these personal computing terms are used with an abandon that is rather confusing, at first, to us noobs; some terms have two meanings, and you just have to know from context.  [ the entry “OTHER CONFUSABLES” is next, but you can skip it until you need to refer to it later, if you want. ]
Desktop and desktop:
A desktop computer is one that will fit on top of your desk (instead of taking-up a whole building).
Your computer desktop (also called your “desktop environment”, sometimes abbreviated “DE”) is what you see when your computer finishes booting-up.  In microsoft WINDOWS, it’s the last screen you arrive-at, which shows the WINDOWS “Start” button, in the lower-lefthand corner of your screen, and your icons, your “theme” or “wallpaper”–that’s the decorative scene or design you can change, to personalize the look.  This screen is your computer’s “desktop”.
In microsoft WINDOWS, you essentially have the option of only the one desktop environment microsoft provides—or at least this is what many Windows users seem to think.  Yes, you can change the “wallpaper”, which is another term used more-or-less interchangably with “theme”—and, if like me you’re old enough to remember (some would just say “over-the-hill”), there is also the term “screen-saver”.  These three do not all amount to exactly the same thing—but they are close enough that people sometimes use them interchangeably.
In WINDOWS, though, other desktop stuff is generally less often something people change.   I guess you can move the WINDOWS Start button to the other side of the screen, but I have never heard of but one or two people having done it.  You can move your icons on your WINDOWS desktop, and delete some, or create new ones.  Docking can be manipulated without too much trouble at all (depending on which WINDOWS version one is using).

But if you wanna move the desktop switch out of the taskbar, or move the status-bar, or download a DE that will use less resources—well, these things are done more routinely in Linux.  Frankly, all these * can * be done pretty much in Microsoft WINDOWS–XP, 7, WINDOWS 98, and so-forth.  There are in fact not a few alternative desktop environments one can download and install to Windows 7, and other versions of Microsoft Windows.    [Here’s a link pertaining to this subject:  http://www.lockerGNOME.com/windows/2012/03/14/should-you-consider-alternative-windows-shells/ ]  Many alternate, third-party Desktop Environments for WINDOWS are free-to-use programs, too, just like in Linux.  But it seems most users of Microsoft WINDOWS just use the default GUI interface (LUNA, Aero, or Windows 8 Power Shell), instead of experimenting with alternatives.
Microsoft WINDOWS is like McDonald’s.  Or maybe the Ford Motor Company, while Henry Ford himself was still in-charge.  A Big-Mac is a Big-Mac is a Big-Mac.  “A customer can have a car painted any color he wants, as long as it’s black.”  (Henry Ford is supposed to have said this, circa 1923 or so, referring to the then on-going production run of the model-T.)
Or at least most of us seem to treat WINDOWS this way.
Linux is way more like Burger King.  Or Baskin Robbins.  Linux has 31-flavors (and then some!).   And you can have it “your way”.  Or you can just use the default configuration.  In Linux, it’s a lot more “up-to-you”:  a Linux-based desktop operating system for your personal computer is more customizable, and generally gives the owner/user the possibility of much more * contreol * over the system, and how it behaves.
WINDOWS is like checkers, or backgammon:  every person out there, just about, can learn to use it, and with pretty descent results.  (Until and/or unless it crashes or gets virused—though I will say that by the latter XP-era, ms Windows seemed to have become very stable, and real viruses seem to have abated in favor of malware/spyware/phishing.)
Linux, on the other hand, is more like Chess.  Anybody * can * learn to use it, and you see that plenty of people have.  But if you * don’t * want * to * learn *, you’re gonna have a harder time.
Still, there will be some folks out there who are readin’ this, who * need * to learn Linux, like for * work *.  More corporations and large hospitals than ever  are now switching to * nix-type systems—Linux and its “cousins”—UNIX, BSD, RHEL, Rocks Cluster and Solaris.  All of these use a UNIX-type file-system.  Indeed, we might just as well say, “they * are * UNIX, ‘under the hood’ “.  More-or-less.  Each one has some tweaks and unique features.  But the actual file-system-structures of those I have just named are essentially * UNIX *, and we can add to that list Apple/MAC-OSX.
Linux will let you download (free, remember!) pretty-much any one of 14 or so desktop-environments, thousands of wallpapers (or make your own, easily, with GIMP!); and there are over 30,000 free applications programs (“apps”,  “software packages”) to choose from.
With most Linux distros (“distro” means a “build”, like there are Widows XP and WINDOWS Vista—so there is Ubuntu Linux, Puppy Linux, KNOPPIX Linux, Fedora Linux, &tc.) you can use one of several desktop-environments.

For Linux, there are the two “heavyweights”—the GNOME desktop environment, and the KDE desktop environment.  KDE is the Cadillac, and GNOME is the Oldsmobile.      KDE is the Mercedes Benz, and GNOME (2.x–”classic”) is the mini-van.  KDE is the de-luxe one, and historically has required the most system-resources and processing-power.  There is now a lighter-weight version of it, which can be downloaded.  GNOME has been more like a middle-weight stand-by, and is the default DE of a fair number of Linux-based rescue discs (to “rescue” a troubled WINDOWS computer!), and other software-tools.  At least where it comes to Linux rescue-wares that even have much of a graphical environment to begin with.  Or rather this has been the case;  “classic” versions of GNOME may continue to be the default graphical environment of Linux live tools like Parted Magic or Helix live cd.  The major desktop distros, however, are in the act of switching to the new, “radical” version of GNOME (GNOME 3), or departing from the KDE-or-GNOME paradigm completely, to go with something completely new.  Like for example the LXDE-project (Lightweight X11 Desktop Environment).  At least this is what is taking-place in the desktop Linux meta-verse, at the time I am typing this.    The reason is simple:  technology (including graphical display software) continues to develop, and this we cannot wish-away.  And hardware continues to move-along, too.  Not many had heard of a “tablet computer” in 2001 (though they existed, but had not “come into their own”).  Or, for that matter, a “netbook”.
Then there are also “lighter weight” desktop environments, which you can use with your desktop Linux operating system.  There is Fluxbox, XFCE, iceWM, JWM + ROX, Openbox, Blackbox, BusyBox.  And there is Enlightenment E-17, which isn’t a “light-weight”, but a full-service DE with very different point-and-click controls.  LXDE is in the “lightweight” category.  Some of these DE s are technically more in the realm of * windowing-managers *,  than full Desktop Environments:  but for our purposes we can speak of them as if they were actual DE s, because in some Linux distros, they basically fulfill that role, by  default.
There are more, but I esteem these the more common, prolific ones.  All, of course, can be downloaded and used freely.  Everything in Linux is free-of-charge.  Or if it is not, there seems always to be a free-version (of the app, standalone, or whatever it is).  Free for ordinary home users like us.  Free for us kids who just want to run Linux on our netbook or moby device.  But if you want Linux for business use (beyond, say, running a small business—i.e. you’re the Social Security Administration, or the U.S. Air Force, or you own a large aircraft company), you will probably want one of the pay-for/”subscription”  versions, that can handle a huge data-base, and do very sophisticated cad-cam, and speed through very complex calculations—involving, say, the Mandelbrot set, or predicting the trajectories of sub-atomic particles.  Some Linux like perhaps RHEL, or maybe Darwin—though Darwin is actually Linux, but rather more a part of the extended-* nix-family.  (“*nix” = “UNIX-like”).
For the rest of us, there’s ordinary, desktop Linux.  And it’s given-away for free.
Where it comes to these “lighter-weight” DE s—like especially Fluxbox, BlackBox, Openbox–often there is no “Start” button or its equivalent, the way it is set-up by default in most of the distros that use it.  Instead, you just need to RIGHT-click, anywhere in empty space, on the desktop interface.  I guess this has kind of a Zen feel to it, because you’re “clicking on nothing, in order to click on something”.  It brings-up a context-menu/”pop-up menu”, and you do what you want from there.  If you have apps or windows open, and don’t have neutral screen space to click-on, look for a tiny open-patch in the lower-left corner of your screen, which these DE s often reserve for this purpose.  Or use your desktop-switch.  Or just minimize stuff.
Myself, where it comes to desktop choices—whether we’re talking about Ubuntu’s new Unity, or GNOME’s version 3, or any of the rest, I find I am able to adapt myself, with really very little consternation.  Play with it for a morning—with the aid of some of the free online instructions—you’ll get the hang of it.  It just isn’t that hard.  And the new Desktop Environs (GNOME 3, KDE 4, Unity) * are *, arguably, better to use on a tablet with a small screen, or any device with limited “screen-realestate”.  And I probly oughtta point-out here that * whichever * Desktop Environment your Linux distro is loading, it probably won’t ask you for any manual configuration or for info about the “screen resolution”.  (But Puppy Linux usually does—though the prompts are easy enough for even a novice to figure-out, and then make note of the info—though Puppy itself usually will also remember, if booted in persistence-mode.)  Most every DE in desktop Linux to-day will adjust itself “automagically”, to the size of the screen it is running-on.  At least as best as * that * particular DE is capable of doing.  (You * can * run Ubuntu 10.04’s GNOME 2.x on one of those early netbooks, with the 8.10” screen; but the control buttons are gonna be infinitesimally small, and it’ll be difficult, probaly, for a n00b user to change that:  this was the main reason for Ubuntu’s UNE/UNR launcher of circa 2009/2010—and Ubuntu’s recent and controversial switch to Unity DE as default graphical Desktop Environment).
It is a truth, though, that many of us who are used to doing serious work on a computer find these new desktop interfaces unpleasant to use, and find that they interfere with productivity and work-flow.  To many of us, GNOME 3, KDE 4 and Ubuntu’s new Unity DE are for the birds.  But there are, as my late uncle used to say, “more than one ways to skin a goose”.  If you’re one of those people who would rather just have * one * interface, and then stick with it, I think my recommendation would be to just download and install iceWM (or maybe XFCE), and just start using it.  If you pick iceWM, I’d be sure and also install the ROX-filer files-manager package, if I wuz you.  Even if you don’t set ROX-filer as your default files-manager (I prefer Nautilus), just having ROX-filer in the system will often help iceWM function properly.
Note that almost all modern desktop Linux distros will in some way present you with the option to use any one of whichever desktop environs are installed to your system, at the log-on screen.  Usually there will be a symbol shaped somewhat like a gear or cog, that perhaps was not evident before you installed iceWM, XFCE, or whatever.  So you don’t un-install the existing desktop environment (GNOME 3, Unity, etc.); you just download and install one (or more) complementary DE s (if you desire), and use whichever one you pick when you log-on.
What I like about iceWM is that, well, it offers stability and familiarity.  IceWM will run on most any desktop Linux distro.  Try installing it from your distro’s default repos first, before you try any other means (i.e. try a build that’s been approved by your distro’s makers first).  IceWM gives you a Windows 95/98-like interface, which will run just as well on a multi-user Linux (like Ubuntu, Debian, Pinguy—most of the rest) or on a single-user Linux (comparatively rare), such as Puppy Linux.  IceWM is a small project compared to the big, historic Linux DE s (like KDE and GNOME), but it is still in a current working state, and is maintained.
Unfortunately for us point-and-click-oriented, “I just wanna plug it and play it” folks, IceWM’s various graphical configuration tools are no longer maintained.  One can try downloading (free of course) and using one of ‘em, and it might work for ya.  But it is really pretty easy to customize the config of iceWM by editing its text-files.  I’d have to say it’s among the easiest of all Linux programs to configure/customize, that does not have current graphical config tools.  And you learn the basics of editing Linux program text-files in the bargain.  Or just use iceWM with its default settings—which is probly what most people do.
IceWM has a Windows-like “Start” button, located in the same place—in the lower-left of the screen.  This has not changed in all the years of iceWM’s being kept current, in terms of it’s source-code being maintained so it will continue to be able to work on the newer releases of Ubuntu, Puppy, PCLinuxOS, or whatever.  Nor is it likely to change.  From the user’s point-of-view, the iceWM desktop environment/window-manager works basically the same as it did in, say, 2007.  Stable, functional, rather plain, familiar.  And to add to iceWM’s virtues, it is light in weight.  So iceWM is likely to do well on a relatively low-powered device—netbook, tablet, or that old notebook your niece handed-off to you, because she got a new laptop for Christmas, and her existing computer was just too good to throw-away.

Yes, Linux desktop distros can be accused of having a “toy-like” look, where it comes to their GUI interfaces. Especially compared to the default DE Microsoft provides with Windows 7 and 8. But you’ll be amazed at what these “toys” can do!
RIGHT-click on everything in your Linux desktop—icons, links, other objects—esp. In Ubuntu:  in modern desktop Linux, you will find some very useful context-menus (“pop-up menus”) this way.  Take some time to play-around with your desktop.  First hover the mouse-pointer on stuff.  Then try a RIGHT-click.
Remember that Linux receives some of its funding from South African billionaire Mark Shuttleworth, who practices philanthropy.  Much of the rest comes from donations.  Of money, or of the man-hours donated by programmers.
An extra feature of Linux is that your Desktop Environment is duplicated—at least once, or several times.  A term for this might be “multiple work-spaces”.  Sorry this may seem a bit confusing to explain, but it’s a nice feature.  What it amounts to is that your Desktop working environment will have at least one “twin”, which you can access easily—usually by clicking on one of those rectangular boxes you see in the bottom bar of the screen.  Some distros come with six available.  In some distros, if you just place the mouse pointer anywhere on the desktop screen, and then scroll the wheel in either direction, you are taken through the other desktops, which look just the same, but on which you can have different applications open.  Power users (and even those who aren’t) use this feature so that they can have different apps open in different desktops at the same time, and other “productivity-related” manouvres.  This feature is also had in Apple/MAC.  In KNOPPIX 6.x, and/or where the Linux COMPIZ-CUBE is part of the system, you can change workspaces by rotating the cube;  this can be done from the mouse, or with a keyboard-command.
Of course you can have different apps open on the same desktop at the same time, just like in WINDOWS.  But there might be a time when you want to do more (assuming you have sufficient RAM).  As RAM keeps getting cheaper anyway, more and more pcs are coming with enough RAM to do a lot more things, and without an upgrade.  Depends on who made it.  Look at your specs.  Anyway, you don’t have to use this “workspaces” feature if you don’t want to.
There is also another nice, related feature, in most distros.   In Ubuntu (and most others), there is a tiny “icon” that normally sits in the band along the screen’s bottom.  It would be on the left side, way over at bottom-LEFT.  It looks like a little dark-colored square, with two tiny white dots on the left side of it.  It’s hard to see.  This is a switch, and its purpose is that, if you click it, it will “automatically minimize all WINDOWS/shells”, and so any open apps will get minimized to the bottom-bar (system tray).  (Windows (7) also has this feature—it’s a small rectangle that stands on its end, at the right end of the systray.)  This can be handy, if you want to access some object or program for which you created a desktop icon, or something that you have placed in the “Desktop” directory, without going to the menus.
For some more recent info on the Linux Desktop Environment upheaval, see the link:
http://www.linuxjournal.com/content/linus-ditches-kde-and-GNOME-so-what

and…

http://www.renewablepcs.com/about-linux/kde-gnome-or-xfce
27. Entry 27:  OTHER “CONFUSABLES”:
There are some other terms which are often inter-changed, or which have more than one meaning.
DESKTOP and DESKTOP:
As we have said, “a desktop computer” is a (somewhat antiquated) term for a computer that is small enough to sit on-top of a person’s desk.
A * Desktop Environment * is the graphical, point-and-click human interface that “rides on-top of the ‘guts’ of your computer operating system—whether it is WINDOWS or Linux, or MAC.  Traditional Linux desktops (desktop environments) are more oriented toward * productivity *; the newer Linux Desktop Environs (such as Ubuntu Netbook Remix interface, or perhaps GNOME 3x) are oriented more towards smaller hardware platforms, like netbooks, tablets, and smartphones.  GNOME and KDE have been the two “major” Linux Desktop Environments (GUIs) from which to operate Linux; but now that is changing.  New DE s are on the horizon.  You might note that, above, say, Ubuntu 9.04 or so (like after about 2009/2010), the Desktop Environment of most Linux distros will (usually) recognize the size of your screen and its proper “resolution” automatically, as it is loading.  So you don’t have to worry so much about whether you’re booting on a tablet, netbook, or full-size monitor:  most of the popular DE s have had some coding added, to enable “automagic” recognition of what size screen they’re booting on.  What you ** do ** still need to worry about is such things like “do I have an nVidia-brand graphics card that’s really new?”, or “the letters and buttons on my screen are so tiny, I can’t read them while using the GNOMR 2.x DE on my 8” netbook”.  But there are programs to deal with al of this—stuff like GNOME 3—which maybe * is * better for a really small screen (?), or nVidia support drivers from the web (be careful, though, and read all the documentation).
FORMAT:
This is both a noun and a verb.  See the entry pertaining to “FILES-SYSTEM AND FILES-FORMATTING”.  Installing Linux Mint to the internal harddrive of a computer is an example of use as a verb:  “I * formatted * John’s computer to Linux Mint yesterday” (verb).
Then there is a format paradigm (noun), which is the files allocation scheme you impose upon any drive (“disk”) when you press that last button in G-Parted or the Linux Mint Installer, or the Hewlett-Packard free formatting tool, or whatever program, to change the files allocation structure of the drive (harddrive, USB thumb-drive, etc.) in some way.  If I feel insecure having my personal files backed-up to an 8 Gb thumb-drive that came with the usual (at the time of this writing) vFAT 32 files allocation format, I can go to My Computer (in WINDOWS 7) and change that to NTFS, if I want.  This will erase all the files, though, so I have to copy them to a folder first.  Then I can re-copy them to the newly re-formatted thumb-drive, when I have changed it over to NTFS format.
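For the curious, roughly the same re-format can be done from a Linux Terminal.  This is only a sketch:  the device name /dev/sdX1 is a placeholder (NOT a real name), so you must substitute your actual thumb-drive partition, which you can double-check with “sudo fdisk -l” first (formatting the wrong device will wipe it).  It also assumes the NTFS tools (the ntfsprogs or ntfs-3g package, depending on your release) are present:
sudo umount /dev/sdX1                      # make sure the thumb-drive partition is not mounted
sudo mkfs.ntfs -f -L MYBACKUP /dev/sdX1    # "fast" format to NTFS, with the volume label MYBACKUP
Just as with the WINDOWS method, everything on the stick gets erased, so copy your files somewhere safe first.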

Synaptic/Synaptics:
The first is a Package Manager popular in many Debian-based Linux distros; the second (with an “s” on the end) is the brand of touchpad/trackpad (the built-in “mouse”) found on many laptops.
IP and IP:
This can mean either your computer’s * Internet Protocol *–as in your IP-address; or it can mean “Intellectual Property”.  You just have to know from context.
FLASH AND FLASH:
noun and verb:  as in a) “Flash video”; b) “I need to re-flash my BIOS”.
There is also “Flash Memory”, as a USB thumb-drive or Jump-drive is also often called a “Flash-drive”.  And there  are about 40 other names for the thumb-sized memory device.
Flash, as in * Adobe Flash *, is a noun, in that it is effectively a browser plugin/player and a type of file format.
The act of updating the firmware program in your computer’s BIOS (Basic Input-Output System—the sort of “master chip” that wakes the machine up and starts the process of loading ms WINDOWS or Ubuntu, or any operating system, from the os’s permanent home and “bedroom” on the harddrive—where the operating system(s) sleep when the machine is shut down cold—into the RAM chips, which is where all programs “live” when they are activated by you) is said to be “re-flashing” the BIOS.  Therefore in this context, it is a * verb *.
ANDROID and Android :
The first is an operating system for phones and tablets built by the Google corporation, and which runs on-top of a Linux kernel (!).  This operating system is free to download.  The second is the Android network controller/network card, which many computers use to manage network connectivity.  The Android network card is built by Samsung.
software stack/file-system/operating system:    For our purposes, these can be used interchangeably; but operating system or “os” is probly the preferred term.
“BOOTLOADER” AND “CHAIN-LOADER”:
“Chain-loading” probably refers more properly to the situation of having more than one operating system that can be booted from one’s harddrive—at least one of which has its own boot-loader.  This is often referred-to as a multi-boot arrangement.  So if you have a laptop that came from the store with Windows 7, and you then installed Ubuntu 10.04 to its harddrive, you then get the GRUB-2 program with Ubuntu (GNU Grub 1.98), and it will remain “associated” with Ubuntu, though it will hand-off the starting boot-process to Windows, any time you use the down/up arrow keys to select Windows when you turn the computer on from a cold state (meaning it was turned completely off, which you should do every night anyway—or more often).
You do not have to install Linux to your harddrive to-day, or make any permanent changes at all, to use Linux; Linux to-day can be run from a USB thumb-key, an external USB add-on harddrive, or other media.  More about this later.
But anyway, if Ubuntu gets installed to your machine’s internal harddrive, then its Installer program partitions the harddrive for you, and prompts you for a few questions.  Ubuntu will reserve less space for itself than it leaves for Windows, by default.  This can be changed later, if you want.
And it installs the bulk of the Grub program (GRand Unified Bootloader) onto the partition it creates for itself—for Ubuntu, while a very small part of Grub gets installed to your machine’s MBR (Master Boot Record).  This latter is called “Grub stage 1”.  Grub stage 1 then takes the place of Windows’ own bootloader’s MBR boot-code, and hands-off the boot process to Grub stage 2 at bootup.  (This over-writing of the MBR, by the way, can be un-done, restored—and more easily than you’d think.  A person who is reasonably competent with technology, and who is not distracted/can just pay good attention to what he-or-she is doing and read the instructions, can probably mess with all this, and be successful.)
Grub stage 2 is where the heart of Grub is, and this is what will present you with a plain menu, as to which operating system you wish to boot—Windows or Linux.  This arrangement seems to work fine most of the time, but can be a big headache on some machines.  That’s why I tend to think in terms of using Linux from a USB thumb-key:  you don’t have to touch your harddrive.  [Another worthy option is an external, add-on harddrive, or a micro-drive (mini-external-harddrive)].  And some people like to just run their desktop Linux from a cd or DVD.  But you should research it some, if you’re gonna do this latter as a long-term way of running; many desktop Linux—while capable of running this way—are really built to be installed to some “traditional”-type harddrive, with those cute little spinning platters inside it.
Anyway, once I got to dual-booting [booting into either Windows or Ubuntu (or some similar Linux I installed)], and everything checked-out okay:  I could then boot into Linux sometime and use a program called G-Parted (Gnome PARTition EDitor) to shrink my Ubuntu partition, and create an empty (small) section (partition) on my harddrive, and then format it to, say, ext2 (a common Linux format); and then I could use, say, a Puppy Linux live-cd to load Puppy onto that space—EVEN INCLUDING PUPPY’S OWN BOOTLOADER.
Then I could use Ubuntu’s Terminal to open the proper Grub configuration file in, say, the Nano CLI-text-editor, and edit the file(s) so that Grub would then “see” Puppy’s own boot-loader (or just boot into Ubuntu and run “sudo update-grub”):  and, if prompted correctly during a cold-boot of the machine, hand-off the boot-process to Puppy’s loader, by just selecting “Puppy” from the Grub-boot menu that appears.  Which in-turn would boot Puppy, if I wanted.  Hence “chain-loading”.
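Just to make the idea concrete, here is a hedged sketch of what such a “chain-load” entry can look like in Grub 2.  The partition location (hd0,3) is a made-up example (yours will differ), and the usual place to put a hand-written entry is the /etc/grub.d/40_custom file:
menuentry "Puppy Linux (chain-load its own bootloader)" {
    set root=(hd0,3)      # first harddrive, third partition -- an invented example location
    chainloader +1        # hand the boot process off to whatever bootloader lives on that partition
}
After saving the file you would run “sudo update-grub”, and the new entry should show up in the Grub menu on the next cold boot.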
As you can see, this is a multi-layered scenario, and furthermore, there are multiple ways to do this “multi-boot” type of thing.  The possible combinations are almost endless. But what I have just described constitutes, more-or-less, a scenario of ‘chain-loading”, because Ubuntu’s own bootloader (Grub, which will be the first one in the “chain”) is set to “hand-off” the beginning boot * to * * ANOTHER * * bootloader * (Puppy’s).  At least if you prompt the setup to do so in the first few seconds of a boot.  Otherwise, the machine will boot to whatever you have set as the default operating system.  If Grub is one of the first “links” in the “chain”—as depicted here, then the “default os” is probly gonna be whatever is the FIRST entry shown in GRUB’s menu screen.  Grub always tries to boot the top entry first, whatever it may be.  This can be changed, too, and usually without a huge hassle.  You can keep WINDOWS at the top if you desire, and have the other operating systems available, further-down in the Grub menu.
That is * chain-loading *, proper, * as * I * understand * it *.  But as I’ve indicated, the term is sometimes used with rather reckless abandon, as are at least a few other computing-terms.
But this is deeper water than I intend to wade-into here.  And this document has much more to do with much simpler (Thank God ! ) ways to start using Linux.  If you’re still with me, read-on.
DEV and DEV :
This can be an abbreviation for either “developer”, as in a “software developer” (i.e. a person who writes software applications, which we used to call a “computer programmer”); OR it can mean a “device”, such as a harddrive, or a certain part of a harddrive, or a peripheral device, such as a printer or a USB-mouse.
PORTABLE APP / PORTABLE SOFTWARE / PORTABLE confused (somewhat) with STANDALONE :
A standalone is a program that is capable of executing (running) by itself, without having to be installed to an operating system (such as Ubuntu or Windows).  A portable app can be the same thing; however, sometimes “portable” is used to mean that the app can run on any of more than one operating system—the other (and perhaps more proper) way to describe this is “cross-platform”.
IDE and IDE:
One is a type of harddrive interface (it stands for Integrated Drive Electronics); in another context, IDE can mean “Integrated Development Environment”:  many programmers actually write their programs for WINDOWS on a Linux system, which is equipped with something like the Eclipse or Kile program, and so their Linux system hosts an “Integrated Development Environment” for writing programs.

28. Entry 28:  MY “BIG THREE”:    To my mind, there are three Linux distros (distributions: distribution/”distro” = a build, like there are XP, WINDOWS Me, Vista, and Win 7 in the WINDOWS metaverse) which are the best for us newbie, tender-foot, non-techy types who just want to * use * the stoopid computer.  I esteem these as 1) Puppy Linux (which, besides its official builds, has many “remixes”, mostly home-built); 2) Ubuntu Linux, which again has many knock-off “variants”, which are mostly not home-made, but produced most often by groups of like-minded developers; and 3) microKNOPPIX, which just means the cd-version of KNOPPIX, as opposed to the full-blown, mega-size, DVD-version.
Ubuntu has very good documentation (free instructions available online).  The way things are headed, I think I’d also recommend Linux Mint, PinguyOS, and/or KNOPPIX DVD, as possible desktops for a noob to start-out with.  Why? For one thing at least, the bigger, more highly-processed Linux desktop distros that are too big to fit on a cd (so you burn ‘em to DVD)—well, these “big boys” are often better at recognizing your critical hardware—networking cards, graphics/usable display on the screen, audio.  The DVD-oriented distros have extra coding, and so often contain drivers and related stuff which the distros that fit on a cd don’t, or which could not be included by the makers, because of redistribution-licensing issues.  But a distributed-community that publishes some desktop Linux distro doesn’t seem to have this problem:  the Mint community, for instance, always seems to include a lot more drivers and codecs in their .iso image, by  default.  Without the three big hardware components—graphics, networking and audio—your computer isn’t much good to you.  And these things can be hard to get going, where the distro did not recognize them Out-Of-The-Box.
Peripheral hardware (like printers, mp3-players, etc.) can be tweaked later or traded-off.  But it can be mighty hard to change a graphics-card in your laptop and install one that has better Linux support—especially if you’ve only done this once or twice.  And it is often easier to deal with some (relatively) minor changes that the makers of, say, an Ubuntu re-hack, may have made to the names of a few folders/directories in the file-system, than to (as a newbie) attempt to diagnose a problem with your network-stack.
I will note here that if you can’t find an instruction as to how to do something with Ubuntu 10.04 (“Lucid Lynx”), any instruction found for Ubuntu 9.10 (“Karmic Koala”) will very often work.  See other files pertaining to Ubuntu, which I intend to post here.
If you are gonna use Ubuntu—especially for productivity/home office—you’d better be okay with remembering and using some work-arounds.  Especially if you’re used to productivity on a good stable WINDOWS install that nobody uses for anything else.  And especially if it’s gonna be on a laptop, employing the traditional “dual-boot” arrangement, where Ubuntu gets installed to a partition on the same drive where WINDOWS already lives.  A good link to help you find a distro appropriate for a certain laptop, and with tips on how to make it actually work is  http://www.linux-laptop.net/ .  A better way might be to experiment with running Linux from a USB thumb-key first.  There are several graphical programs that help you do this, available online for free.  Ubuntu even comes with its own—Startup Disk Creator—available to you from the menus—even in the live-cd sessions!  But I tend to prefer running desktop Linux from an external USB harddrive.  You just BIOS-boot it—just like you would a live-cd of Linux.  That would be to say, after you have installed Linux to the external drive.  For this scenario, USB thumb-drive creators often are not the programs to start-out with;  I find it much better to just boot my live DVD or cd, and then use its own installer to put the distro onto the external harddrive.  You gotta do this with some care, though.  So maybe it’s a questionable strategy for a raw noobie.  If you do it, the key things are:  1) understand how to use the tiny arrow for the drop-down menu in the installer, or similar installer-menu options, that let you select * WHICH * harddrive you’re “pointing-at”; and 2) where you’re gonna install to an EXTERNAL harddrive, you need to * be sure * you can get the program to install Grub * to * the * EXTERNAL * drive—and not to the machine’s ordinary, internal harddrive.  This latter often involves use of the installer’s Advanced tab.  But take heart:  this is a lot easier than it sounds, and you are often presented with very few options to choose from anyway, so it’s harder to screw-up.
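Before you point any installer (or G-Parted) at a drive, it is worth a minute in the Terminal to confirm which device name belongs to which physical disk.  A sketch only; the device names below are typical, not guaranteed, and on your machine the lettering may differ:
sudo fdisk -l      # lists every disk the system can see, with its size and its partitions
# the internal harddrive is usually /dev/sda; a USB external drive or thumb-key often shows
# up as /dev/sdb or /dev/sdc -- match the reported sizes against the drives you actually own
On newer releases, the lsblk command gives a similar (and tidier) listing, including where each partition is mounted.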
If you install desktop Linux to an external harddrive, but accidentally let its Grub install to your internal harddrive (especially if the Windows install you don’t wanna hurt is in there), then you’re gonna be in a pickle, and the only way to get Windows back will be to 1) do a restore of MBR (with your WINDOWS rescue-disk, or perhaps with a live Linux rescue disc, like maybe KNOPPIX or Helix), or UBCD (dos-based); or 2) do a complete recovery/reformat of Windows.
As with any of this, you need to do (at least some of) your own research.  I just intended this document as a general guide, to try and get * concepts * clear.  And to try and help others with the benefit of my (limited) experience.  If you don’t feel comfortable doing something, don’t do it.  Or find some (old) equipment you don’t care about.
All operating systems have their quirks.  Ubuntu certainly has its share.  [The LTS releases seem better, though, than the in-between ones that come out on the six-month cycle and get a much shorter support window.]  I will have to add, however, that between 1) the Ubuntu/Canonical Ltd. entity itself, and 2) the distributed user-community, these issues seem to get fixed more readily, or a fix is published online—which can be found by Googling.  Early downloads of Ubuntu 9.10—even the “final release” (which I guess means no-longer in Beta)—could be very buggy; however, by the time I downloaded it (which was near the end of its software-support cycle), most of the issues seemed to have been fixed.  My understanding of Ubuntu’s distribution process is roughly that the distro’s .iso is initially made available for download in Alpha 1, then Alpha 2, then Beta (1, 2, and perhaps even 3 and Beta 4), and then is released as “final release”.  (See the entry as to “Alpha and Beta”.)  Once it reaches “final”, the .iso that is posted for download is no longer altered or improved.
At least that is my understanding.  Further improvements/bug-fixes are made by way of software updates, which are received over the web.  Which is fine, maybe even if you don’t have broadband, and are stuck with dial-up.  Because you could probably just click the updates manager when it pops-up, and let it run in the background, or else close it, and then re-open it before bed, letting the machine do its thing over-night.  You could then usually apply the changes in the morning, or continue to let the Update Manager run in the background, if it paused because it needed to ask you a question, which happens once in a while.  Once the Update Manager has been run a few times, the “stickiest” updates are usually out of the way for awhile, and it will usually just want to download small numbers of updates at a time (say, between 4 and 10) after that. And it will stop bothering you frequently, and just want to run every 2-3 days or so.
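If you would rather not wait around for the Update Manager window, the very same thing can be kicked-off from a Terminal on Ubuntu and its relatives.  Nothing exotic here; this is just the standard pair of commands:
sudo apt-get update       # refresh the list of available updates from the repositories
sudo apt-get upgrade      # download and install them (you are shown the list and asked to confirm first)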
But I have a sneakin’ suspicion that Ubuntu’s management is changing the final .iso of certain releases during the life-cycle, which, if true, would be an advantage for us users, because some important bug-fixes would then get rolled-in to future downloads, and already be present when somebody downloaded, say, 9.10 or 11.04 some months after it was initially released.  (I know the LTS versions do this, because if you look at the .iso filename of Ubuntu 10.04 in Windows, it will say “ubuntu 10.04.2.iso”.  At least if you downloaded it in the second year or so of its life-cycle.  That’s what that extra digit on the end seems to mean.)  (Most recently, I downloaded 10.04 as “ubuntu 10.04.4.iso”).  And so you would not have to rely on your network connection to get some fixes that might be important, and which came along later, after the initial final release.  Really, it does not matter much either way:  a lot of coffee houses will let you sit in there all day, for the price of a cheeseburger.  And public libraries often have free wifi.  So once you have downloaded it and set it up, the waiting updates will come over the internet, and get installed when you click “install updates”.  If there are a lot of updates waiting, however, I seem to notice it can take more than one use-session to get ’em all.
Where it comes to Ubuntu, I think my strategy would be to first try using the LTS version that is in current-release, whatever that is by the time you read this.  I’m no authority, but Ubuntu’s LTS versions seem to be more stable, and less buggy than the 6-month cycle releases in-between.  The versions of the bundled apps that come with the LTS Ubuntus may be older ones, that are somewhat dated by the time the developers could put the LTS together, but it’s a trade-off.
Quite frankly, the apps that make up the software suite of just about any release of Ubuntu (and many of its variants) are often not the most current, but rather are a couple of versions older.  This has to do with tried-and-true stability, and with the sheer difficulty of assembling a modern operating system of any type.  There are just so many programs that must all work together, and many other complexities.  If you really need a newer version of some app before the next release of Ubuntu comes-out, there are often ways to do it:  but you may just have to compile the program from source-code.
This does not mean sitting down to a very long table somewhere and poring-over a roll of white paper with line-after-line of 1’s and 0’s printed on it, until you have re-written the binary code.  Those days vanished with T-Rex and Iguanodon.  Today we use a * compiler *, and it’s built into the system.  In most cases, there are three easy steps to compiling a program from source in modern Linux, and each is automated—so the system actually * compiles * it * for * you *.  All you have to learn (usually) is how to tell it when and where.  Which means you’ve gotta know just a little about the UNIX file system.  But not much, really.  Hey, you’ve probably de-compressed .zip files in Windows, and installed * them *.  Compiling from source in to-day’s Linux is often no harder.  Just take time to read the instructions.  If you don’t have the time, you may be able to get somebody who knows to help you.  Look around for a LUG (Linux User’s Group) you could join, even if you haven’t booted your first Linux yet.  It can be rewarding.
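Just so those three steps aren’t a mystery, here is a hedged sketch of what compiling from source usually looks like on a Debian/Ubuntu-type system.  The archive name someprogram-1.2.tar.gz is made-up, and real projects vary, so always read the README or INSTALL file that comes inside the archive:
sudo apt-get install build-essential    # pulls in the compiler and its friends, if you don't already have them
tar -xzf someprogram-1.2.tar.gz         # unpack the source archive (a ".tar.gz" is just a compressed folder)
cd someprogram-1.2
./configure                             # step 1: checks your system and writes out the build settings
make                                    # step 2: the actual compiling
sudo make install                       # step 3: copies the finished program into place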
In any case, you don’t have to install it.  You can run Ubuntu (and other distros) from a USB thumb, or you can run it from a partition in an external harddrive (now my favorite way).  If you have enough money to buy an external harddrive—and especially just for the purpose of dedicating it to desktop Linux—this seems like a good option.  It will give you the opportunity to experiment, and to learn to do stuff like use G-Parted and/or Parted Magic (its own small graphical Linux that runs from a disc), and to wipe-out a Linux install that didn’t work-out, and try something else.  All without messing with the disk to which your WINDOWS is installed, which would be a traditional dual-boot arrangement.  And the traditional dual-boot arrangement often seems to result in Linux trying to run “with a headache”, and so it doesn’t do so well.  Whereas running desktop Linux from a harddisk it does not share with WINDOWS seems to cause it to do better.  That is why some people have just selected “use entire disk” when installing desktop Linux to an INTERNAL  harddrive:  but I do not recommend this, as it will over-write WINDOWS, and WINDOWS  may still come in handy as a fall-back position for awhile.  Messing with desktop Linux on an * EXTERNAL *, dedicated harddrive is safer for your WINDOWS install—provided you do not accidentally run the partitioner-program (like G-Parted) or an installer [like Ubuntu’s installer (code-named “Ubiquity”)] in your computer’s INTERNAL harddrive, or otherwise accidentally make some permanent change to your machine’s INTERNAL harddrive.
ANY TIME YOU ARE ABOUT TO INSTALL SOMETHING LIKE AN ** OPERATING SYSTEM **, YOU SHOULD HAVE EVERYTHING ON THE EQUIPMENT IN-QUESTION WELL BACKED-UP—YOUR WEDDING PICTURES, VIDEOS OF YOUR KIDS, YOUR TAX RETURNS, OLD LOVE-LETTERS, * EVERYTHING ELSE *, AND OF COURSE THE * WINDOWS * OPERATING * SYSTEM * ITSELF.  Backup to cds, backup to DVDs.  Keep them in a secure location.  And then backup to a reputable cloud service.  Many of these will offer you several Giga-bytes for free.  Ubuntu One will give you 5 Gb. for free, and will work fine from a WINDOWS computer as well.  The DropBox people give 2 Gb, and MS Live SkyDrive gives you 7 Gb. free—though you gotta have a hotmail or some other type of account with them.  But that’s also one you can open and have for free.  You don’t even have to use the hotmail account.  Or you can just make it into a spam-box (trash-can for the spam you’d otherwise receive in your real e-mail).
Then backup your personal files to a thumb-key/thumb keys, if you really wanna be right.  At least 3 layers of redundancy is what the professionals do.  * NO * layers of redundancy is what most people do.  Most people are like my neighbor Belinda, who lost her wedding pictures.  Not because she was messin’-about with Linux, but because XP decided to take a dump on her.  Seems she got this malware from a webpage.  Maybe she wasn’t so broken-up anyway.  She has been divorced for some time, and did not even invest (what funds might have been available to her) in a service-call to try and have them restored.  But I cite her example as a warning to the rest of us—especially as we are considering messin’ with a different operating system, which of course involves doing some formatting to a drive, once one is no-longer content with live-cd sessions.  So let’s be smarter than Belinda (not her real name), and let’s learn to backup everything before we indulge some procedure to alter the file-system of a drive—even a thumb-stick.  Remember:  We’re the * Linux * People.  We’re supposed to morph ourselves into something a cut-above the average Windows user!
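One plain way to do the “copy my personal files somewhere else” part from a Linux Terminal is the rsync command.  A sketch only:  /media/backupdrive is a made-up mount point, so substitute wherever your external drive or thumb-key actually appears under /media:
rsync -av --progress ~/Documents/ /media/backupdrive/Documents-backup/
# -a keeps permissions and dates intact; -v and --progress show you what is being copied.
# Run it again later, and only the files that changed get re-copied.
Of course the graphical file manager works fine for this too; rsync just makes the repeat backups quicker.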
Even if you end-up with the finding that “desktop Linux just isn’t workable for me and my situation right now”, you’ll still have the external harddrive to use as video and data-storage, or you can sell it as “like-new” on Amazon or E-Bay, and perhaps get most of your money back.  It’s usually pretty hard to ruin one of these modern external harddrives by experimenting with desktop Linux, if you just watch what you’re doing.  NOT IMPOSSIBLE.  So take responsibility.  Be an adult, read the instructions and the release notes first.  If you don’t understand something, Google it until you do, or else be satisfied with taking the risk.  There will always be some risk to your hardware and your WINDOWS install, because nothing is perfect, and there is always the possibility of human-error.  Especially for us newbies.
The G-Parted program has a tiny little arrow in its upper-right area, that allows you to select a different disk to work-on.  G-Parted usually opens in your INTERNAL harddrive, by default.  It’s a similar situation with Ubuntu’s installer, and the installer utility of most distros.  Be careful of which drive you’re in.  If you have not clicked “apply changes” and then confirmed * that *, it is not too late to back-out, and go at things later (or never).
What exactly I did to the external hdd is detailed further-on.  Read on.
Still, if you are just doing a typical dual-boot install, where you are going to use the option to have the two operating systems sit “side-by-side” on the internal harddrive, then you probably won’t even have to mess with G-Parted and so-forth.  In some relatively exceptional situations the installer still might ask you to, because not all installs of WINDOWS to a machine—even at the factory—are the same.  Some installations of ms WINDOWS have extra hidden partitions, boot-sectors, and various tricks—most of these aimed at recovery in case of a crash or a particularly bad virus.  Dell is often cited for this, but it is not unique to them.
Take note that if you continue to use any LTS (Long Term Support) build past the end of its “support cycle”, it will STILL CONTINUE TO WORK.  It will just be a lot more difficult (for a noob) to keep it up-to-date, according to the usual standards.  However, some people * do * use it this way.  Further, remember that Ubuntu’s LTS releases each last for three (3) years (5 years in the case of Ubuntu 12.04), in terms of updates supplied via the web.  The clock starts from the day of official final release of the build, which Canonical publicly announces.  This is in April, every two years, and then the build is supported for three years from then on.  So an LTS release (or any release, for that matter) is supported with software updates for functionality, bug-fixes, and importantly * security-fixes/the closing of discovered exploit-holes * for a specified length of time starting from the day of its mature (“final”) release—not from the day you decided to download or install.  This seems obvious, but it can easily slip one’s mind.
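If you are ever unsure exactly which release you are running (and therefore when its support clock started), a couple of harmless Terminal commands will tell you.  These are standard on Ubuntu and most of its relatives:
lsb_release -a     # prints the distro name, release number, and code-name
cat /etc/issue     # a shorter version of much the same information
Once you know the release number, a quick web search will turn up its end-of-support date.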
This applies, by the way, to pretty much any Linux distro of any kind—whether LTS or not:  You can continue to use it past the end of its life-cycle, and try to update important softwares manually, or else not update anything.  Many Linux users out there just ignore Update Manager when it appears, and use the same build for many years after it is no longer officially supported.  Because the Upgrade to the next release version sometimes inexplicably fails, and one must then reformat by using the new release, which can be a headache with several days of downtime, while you fight with the machine.  And if you forgot to backup some personal files, they could be lost.  That’s why I like the LTS builds—they have 3 years of life-cycle (and the latest LTS has 5)—at least from the day of their release by the makers.
If you use Linux without updates, you’ll be at a somewhat greater risk from the little bit of Linux malware that does exist out there.  But, at the time I write this at least [late 2012], this seems niggling.  Another nice thing:  because Linux writes-to-disk differently, most runs of an updates manager in Linux do not require you to restart/re-boot, and this has pretty much always been the case.
What is more-to-the-point is that your APPS will no longer be kept current, if you’re going to keep using a release past its EOL (End Of Life).  So the versions you have of, say, Totem media player, will get older and older, just by remaining the same, while the rest of the software metaverse continues to develop.  When your apps get outdated enough, they will begin not working with the internet, or with files you have downloaded.  This could take * years *, though.  Depends on what files-formats, and the relative pace of technology, your personal tastes, &tc.  When this gets bad enough, you will be almost forced to Upgrade to a newer operating system, whether it is Linux or some other thing, notwithstanding.  It’s the same trip for WINDOWS users; Microsoft is discontinuing support for XP, and in a few years, many of its softwares will no longer work with content from the web, or will have stopped working because of ordinary files-corruption, or other reasons.  But the repair files will no longer be available for download from maker’s sites, so people will have to go buy a WINDOWS 7 disc, or, more likely, buy a newer computer which has the CPU and RAM “oomph” necessary to run Windows 7 properly.
With Linux, you can more often just download and start using a newer version of your distro, or perhaps find another distro you like.  It is less likely you will have to invest in newer hardware—though eventually even this will become necessary.  Because you can’t expect the world to stand still, or to hold-on to your beloved laptop forever.
So I guess my point here is that it might be better (for us noobs) to wait a little while, until a release has been out in “final” for awhile before trying it.  At least where Ubuntu is concerned.  This is probably good advice where it comes to any Linux distro (at least for newbies).  So let’s pick a build that has been in final release for at least a little while, before trying it as a live-cd.
I guess my more over-arching point here is that releases of Ubuntu can be buggy.  PARTICULARLY THE “TWEENERS”—by which I mean those releases that aren’t affixed with the designation “LTS” (“Long Term Support”).  (Hey, every operating system ships a buggy release now and then; remember WINDOWS VISTA?  But Ubuntu does not belong in that company—Ubuntu and its variants are better than that.)  And I speak here of pre-Unity DE versions of Ubuntu (like 9.04, 9.10, 10.10).  If you’re a newbie (and that’s what this blog is about), I guess you could try the standard new versions of Ubuntu (11.04, 11.10, or 12.04 LTS), which come with the new Unity DE (Desktop Environment).  [UPDATE:  the new LTS (Long Term Support – used to be 36 months, now 60 months of updates-support) version of Ubuntu has been released now, and indications are that the difficulties with the new Unity desktop environ have been fixed.  At least as much as * can * be, in terms of productivity and work-flow.  You may keep reading from here, if you’re that interested in the Ubuntu “Unity Desktop controversy”; otherwise, it might be just as well for you to just page-down to MY BIG THREE, continued, and pick-up from there.  –L.L.]  Unity might work-out for you just fine.  “Traditional” GNOME (GNOME 2.x) is still offered as an option through Ubuntu 11.04, and there is a “GNOME Classic Fall-Back mode” in Ubuntu 12.04.
In Ubuntu 11.10, you get several desktop options, each of which is already in the system so you don’t have to download ‘em.  You can just use Unity (default), or, if your system is having trouble with the 3-D, or you’re maybe on a netbook or limited hardware platform, you can select Unity 2-D (two dimensional) from the menus.  You might note that Unity 2-D is Qt-based, rather than Gtk-based (this refers to the system-libraries used:  Google “Gtk+ versus Qt” if you want to know more, but really, I don’t find this something you really need to understand, just to migrate to desktop Linux).  If you don’t like either of these, you can select GNOME SHELL, which is also available, and I think there is a version of classic GNOME in there too (under “GNOME Fallback” ??).  And there is GNOME 3.  You can also select the option for “GNOME no visual effects” in this release.  Similar “no visual effects” options are available in various iterations of the GNOME desktop, in various Linux distributions, by the way.  GNOME SHELL seems to be the new iteration of GNOME desktop, and is more like Ubuntu’s new Unity desktop environment than like the “classic” GNOME (i.e. GNOME 2.x).  Classic GNOME will probably still continue to be maintained—somewhere:  but the surviving small community of core GNOME enthusiasts probably will not be able to keep all of classic GNOME’s compatibility interfacing up-to-date with the new metaverse of apps that the newer DE s will spawn; so it will probably be increasingly the province of progressively obscure rescueware and software tools.
GNOME SHELL, on the other hand, seems like it will be the way-of-the-future for the GNOME project.  I guess we’ll just have to see how it all unfolds.  But one of the good things about Linux is that there’s always a good desktop option (even for us noobs), no matter what.  The key is finding the one that’s gonna be satisfactory for * YOU *.
What seems to be the case is that the new “netbook-like” desktop environs (that would be UNITY, GNOME SHELL, AND GNOME 3) are actually * EASIER * for those who are not “into” computers, but are not as good for SOHO (Small Office/Home Office) or for a high  degree of personalization/customization, as in making a large number of changes to how your desktop-interface will appear and work.  When it comes to computers, anybody can change wallpaper (themes) on dang-near ANYTHING—including any of the DE s I have named above or anywhere else in this article.  And without bein’ a computer-science major.  So you can change the basic look, no matter what DE you have.  When it comes to certain other personal (or productivity) settings, though, UNITY, GNOME SHELL, and GNOME 3 are apparently more limiting, in what they will let you easily change.  This does not seem to matter much to the average WINDOWS user, who just wants to do his or her homework, and have some fun on the computer—whether it’s a netbook, tablet, or full-size desktop model.  For these “social-users”, UNITY, GNOME SHELL, and GNOME 3 will almost certainly constitute an IMPROVEMENT.
For us SOHO/research computer users, well, it looks like we’re gonna have to do a little trial-and-error, perhaps, in order to find a Desktop Environ that’s as good for our needs as the now-being-phased-out GNOME 2-x series was.  “Progress”.
There’ll be a shake-out, just as there has been in the automotive and other markets.
One of the chief gripes among SOHO users in regards to Ubuntu’s new DE/user-interfaces seems to be “there’s no taskbar!” in UNITY.  Well, my understanding is that you can easily install the Tint2 Panel, which is a popular, light-weight choice for a Linux taskbar.  It’s graphical/point-n-click, is available in Ubuntu’s standard repositories, and works very much like the taskbar in XP.  And I’ve heard that it functions just fine with the new Unity DE.  I guess we’ll find-out.
I’m sorry to be so frank here, and people are probly gonna flame me on this, but indications are that Ubuntu 11.04 and 11.10 (Oneiric Ocelot) are even buggier than previous Ubuntu, OOTB.
Ubuntu 10.04 LTS doesn’t seem very buggy at all, and I notice that I’m able to—with minor annoyance—use it on a daily basis for productivity (from the dual-boot arrangement on my internal harddrive; I’m still trying Ubuntu 10.04 from my * external * hdd, as I have time.  But so far the external-hdd install seems * bugless *.  And I’ll add that Karmic Koala is still running seemingly bug-lessly on my mom’s hp slimline tower—and this on the same disk as Vista—though neither of us has booted Vista in months.)  I know the few work-arounds I need, and use them when the system balks.  I have found these by trial-and-error, by Googling, and just by “intuition”.
Many people like Unity.  Many don’t.  I have described ways in here to tweak it for productivity.  This mostly means just installing the Tint2 panel, which gives you a traditional-style taskbar, which many people miss on machines larger than a small netbook.
If you get disgusted, I think maybe I’d just try downloading and using Xubuntu.  Maybe that would just be the go-to option in the first place.  Frankly, where it comes to Ubuntu, I think I’d go with 10.04 LTS.  10.04 LTS (“Lucid Lynx”) will continue to be supported through April of 2013.  (And probably more like late May or June.)  If you use an LTS version, you will notice it has a 3-year life-cycle, and so can be kept up-to-date for three whole years (from the first day it is released by Canonical, Ltd.—not from the day you * install * it).  Ubuntu offers a new LTS release every two (2) years, and then that LTS release is supported with frequent software updates for the next thirty-six months.  WINDOWS, by contrast, pops-up Windows Updater “when Microsoft feels it’s necessary”, and you are able to receive Windows updates “when Microsoft feels they are needed”.  Even if you run Windows update manager more frequently, it will not list updates available for download until Microsoft makes them available in its servers.  Sometimes MS is fairly prompt about this, and many times it is not.  Especially when contrasted against the makers of Ubuntu (Canonical, Ltd.), and the co-op authorities of other Linux distros, such as the KNOPPIX Project or the Debian Foundation.  This is said to be especially disordered, as WINDOWS is * that * much more vulnerable to malware, and heading-off malware is (arguably) the purpose of the preponderance of these network updates.
I will add right here that Ubuntu 10.04 Lucid Lynx boots and runs without issue on my netbook (an Acer Aspire One).  The control buttons on the various graphical interfaces were very small, as what you seem to get with Lucid Lynx on a netbook or other small screen is just a miniature version of what you would see on a full size monitor.
Some prospective tips (which somebody passed-on to me) for those who just wish to “try and tame the Unity DE in 11.04 or 11.10” (THIS PERTAINS TO FULL-SIZE PLATFORMS—not so much netbooks/tablets/phones):  1) The keyboard shortcuts available to navigate Unity make this a far more usable desktop.  A Google search should turn these up.  (Check-out “GNOME PIE”.)  2) Installing CCSM (the CompizConfig Settings Manager) makes the dock far less annoying.  3) Reduce the size of the icons to 32 and it’s much less intrusive.  Source:  michael, May 15, 2011 at 8:53 am, http://desktoplinuxreviews.com/2011/05/01/ubuntu-11-04/comment-page-8/#comments
4) If using something larger than an 8-inch (204 mm or so) screen, install Tint2 panel, which you can probly do from your default repositories with a couple of mouse-clicks.
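If you would rather do items 2) and 4) from the Terminal, the package names below are the usual ones in Ubuntu’s standard repositories.  A sketch, assuming a stock 11.04/11.10 install with the normal repositories enabled:
sudo apt-get install compizconfig-settings-manager    # the "CCSM" tool, for taming the dock's behavior
sudo apt-get install tint2                            # a light-weight, XP-style taskbar/panel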
There are also guides online, as to how to tweak and set-up Unity.
10.04 LTS (Lucid Lynx) has a little bit of a “weird” interface; the buttons for minimize, maximize, and close are moved to the LEFT, for some reason, and are very small.  There are ways to change this, though.  (See http://www.youtube.com/watch?v=f2wFjPy-wAA&feature=relmfu )  But this might be more complex (for a newbie, anyway) than you’d think.  I have gotten used-to the default desktop interface, with the close/minimize/maximize buttons on the LEFT, and the fact that they are ROUND, and also very, very SMALL.  It just didn’t bother me; indeed, I think it has helped me develop the mental habit of a dynamic attitude toward graphical interfaces—something useful to my career, as I am called-on to try to fix different technology set-ups at work, and, as I never was a gamer, and I did not have a computer prior to three years ago, an unfamiliar graphical environment would often throw me.  But it might not be helpful to * YOU *.  There are other tacks one can take, however, in regard to Ubuntu 10.04’s new look (new in comparison to previous Ubuntu).
Some people, for instance, just download the UNE interface from Ubuntu’s repositories (now in PPA).
UNE interface is the interface from Ubuntu Netbook Edition (sometimes known as Ubuntu Netbook Remix, or UNR), which was an official variant from Canonical, most of the features of which have just been incorporated into 10.04 Lucid Lynx:  so UNE is no longer maintained as a separate release (or it soon will not be).  But its desktop interface can still run on top of modern Ubuntu—probly including 11.04 and 11.10, as well as 10.10.  More than a few people appreciate this, as it presents you with nice, big icons for frequently done tasks, and has bigger buttons, which often makes it good for us older folks who have some vision problems.  Unity does too, by the way; but it’s still * Unity *, and some people just seem not to get-along with Unity.  Please note that I have not done this (installed UNE interface) myself, so I don’t know from firsthand experience if there are many catches, or not.  It took me many weeks just to get 10.04 Lucid workin’ on my good laptop here, just working on it in my spare time (though as I did not * have * much spare time, I did not spend that many man-hours at it, so I guess it wasn’t really so bad, if you added it up).  A good link to have a look at in regard to this issue (recent desktop interfaces) would be:  http://www.omgubuntu.co.uk/2011/05/pre-unity-ubuntu-netbook-launcher-is-resurrected-put-in-a-ppa/
It is also true that there is an official release of Ubuntu with LXDE as the default desktop environ.  Not surprisingly, they named it Lubuntu.  (LXDE + Ubuntu = “Lubuntu”).  LXDE is another light-weight desktop that has been developed by a group of like-minded programmers to run on-top of Linux and its window-managers.  LXDE is a recent project.  It will auto-adjust to pretty much any screen size—laptop, desktop monitor, big screen tv, netbook, tablet, etc.  If the screen-resolution is way off, you will have to try a re-boot.  If this doesn’t work, you could try another build, or else give it some help from a CLI.  There are at least a few relatively simple ways to attempt this, described online.
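One of those relatively simple CLI ways is the xrandr command, which most modern distros ship with.  A sketch only:  the output name VGA1 and the resolution are invented examples, so run plain xrandr first and use whatever names and modes your own machine actually reports:
xrandr                                  # lists your displays and the resolutions they support
xrandr --output VGA1 --mode 1024x768    # ask for a specific resolution on a specific output
The change only lasts until you log out or re-boot, so it is a safe thing to experiment with.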
Screen-resolution adaptability is not what makes LXDE compelling, however.  Rather, it is other virtues.  LXDE is an up-and-comer, with many nice new features, and the promise that it will be especially good on low-wattage, low-resource hardware systems—like netbooks and tablets.  I like how LXDE is arranged in microKNOPPIX 6.4.4.  I am sorry to report, however, that where it comes to Lubuntu—or otherwise running on-top of Ubuntu at all (as with Linux Mint LXDE edition, which is still largely Ubuntu under the hood):  well, LXDE seems to not be very compatible—at least not yet.  [Actually, by the time you read this, I think it is very possible most of the issue will have been fixed.  Research it a little.]  So I would tend to think in terms of distros that AREN’T based-on Ubuntu, where it comes to trying to use LXDE.  At least maybe until sometime into the next release—like after, say, 12.04 (which is an LTS!) has been out for awhile.  Let’s give these people a little time, if we can, to get some of the kinks out.
Another factor which might intervene in the “Unity desktop storm” which is going on right now is… the Linux/Ubuntu * user * community.  Let’s not discount them, even if only for a “bridge”, stop-gap solution.  Google:  UGR (Ubuntu GNOME Remix).  Another thing might be “MATE interface for Ubuntu (fork of GNOME 2)”.  You could get a little info also on Cairo Dock for Linux.
MY BIG THREE, continued:
There’s another distro I should mention, which I just came-upon.  It’s called “AriOS-Linux”, and it’s from the Middle East.  This is an Ubuntu-based distro that apparently has been around for a few years, though it had a predecessor that went by a different name.  It comes with the multimedia and Flash stuff ALREADY INSTALLED, and is built with “out of the box newbie functionality in-mind”, so to speak.  (My own words, but they are in-keeping with the spirit of the distro’s creator.)
And what is more, AriOS seems to have a good solution to the difficulties sometimes brought-about by Ubuntu’s new Unity DE where traditional, full-sized machines (with full or near full-sized screens) are concerned.
It is further the case that AriOS comes with Nvidia and Ati graphics drivers * as * part * of * the * .iso * download *, for a change—a welcome departure for those of us with this equipment who have a poor internet connection.
I haven’t tried it yet, but I will attempt to update this post as soon as I do.
I do not want to turn this into an encyclopedia of desktop Linux distros, but I do not want to leave-out Easy Peasy Linux, either.
Easy Peasy is a community-developed distro built on Ubuntu 10.04 Lucid Lynx, and it is built specifically to run on NETBOOKS.  Indeed, it is not advisable to try to boot Easy Peasy on equipment other than a NETBOOK.  Easy Peasy was and is developed to run on the widely-available netbook computers, most prominently the Asus eee series and the Acer Aspire netbooks.
I have booted both EP 1.5 and EP 1.6 on my Acer Aspire One netbook, both times as a live-cd session.  It takes it awhile to boot from live-cd, but this is with no persistence configured.  If persistence is set-up, or the distro is installed to the harddrive, then the booting os can consult settings-files as to the particular hardware and screen size it’s booting on, instead of having to figure all this out from scratch every time.
I did not use it for very long, as I had other things which demanded my attention.  I could find no deficiencies with either build, though.  Easy Peasy is highly conditioned to run on netbook hardware, and has its own desktop configuration, which I think is a conditioned GNOME 2.x, if I am not mistaken.  Anyway, it’s a nice, intuitive, usable desktop.  And Easy Peasy seems to have a good reputation, among the world-wide netizenry.  There is support, too, because it is community driven.  It has its own forums, bug-tracker, and documentation—indeed, every Linux distro is supposed to have these features, if it is going to be included in the listing on DistroWatch.com.

RIGHT-click on everything in your Linux desktop—icons, links, other objects—esp. in Ubuntu:  in modern desktop Linux, you will find some very useful context-menus (“pop-up menus”) this way.  Take some time to play-around with your desktop.  First hover the mouse-pointer on stuff.  Then try a RIGHT-click.  Reserve some time just for a substantial play-session, if you possibly can.  When it comes to BOTH platforms—both WINDOWS * and * Linux, well, it is difficult to answer every question based only upon what you can read.  And there seems to be a minor phenomenon which we might characterize as “a great-many of those in the ‘Real Geek’ community find certain technology questions so obvious that they do not bother to publish solutions for them, either in the print-media, or electronically/online”.
So sometimes, we are just “better left to our own devices”.  A little trial-and-error goes a long way.
Where it comes to Ubuntu 10.04 (code-named “Lucid Lynx”, as I may have indicated), a really great webpage for a newbie to consult is http://blog.thesilentnumber.me/2010/04/ubuntu-1004-post-install-guide-what-to.html .  If you go to this link, BE SURE AND GIVE THE WEBPAGE PLENTY OF TIME TO FULLY LOAD, as it contains many screen-shots and illustrations to help a newbie understand things about this distro of Linux.  This page has illustrated instructions for the set-up and operation of Lucid Lynx, and is written in plain enough English that an ordinary person should be able to figure it out.  I will emphasize, though, that before you do anything much after installing pretty much any version of Linux that has “Ubuntu” as part of its official name, and which comes directly from Canonical, Ltd., you should install the so-called “Restricted Extras” package.  This can be done from the Software Center:  however, some knowledgeable people recommend that this be done from the Terminal.  If I am not mistaken, I believe the correct command (“syntax”) is:
sudo apt-get install ubuntu-restricted-extras
The author of the above webpage is someone I admire; but really, he does not come to this point-of-interest until rather further-on in the article.  Be sure and install this package, so that Ubuntu will work properly:  this is good advice—at least for those of us who are new to Linux, and aren’t prepared to shop for these packages a la carte.
Don’t be intimidated.  This is only one of very few Terminal operations you will be called-upon to do.
[Remember that this is intended for Ubuntu; other distros may have a different way of handling this.]  Just open a Terminal (in Ubuntu 10.04 Lucid Lynx click Applications > Accessories > Terminal); carefully type the above syntax (all lower-case).  Double check it, to be sure that you included all hyphens (-), and that it is rendered EXACTLY.  If it is typed correctly, hit Enter.  You will be asked for your password, which will not appear on the screen at all as you type it (not even as dots), so be on-the-ball when you do this one little operation.  [Most of the rest of the time, Ubuntu lets us slack, and forgives our sloppiness and operator errors.]  Hit Enter again.  If you mistyped your password, it’ll give you as many tries as you need.  If you mistyped the command, it’ll give you an error message, and you can try again.
When you get this done, give the program a little time to execute, and then close the Terminal with the X in the upper-corner, just like you’d close any program shell/window.
In the 11.04 release of Ubuntu, if you are installing to a drive with Ubuntu’s installer, it will present you the option of adding the missing multimedia codecs at about the second prompt in the install process.  Add them.  This will enable you to watch videos off the web without downloading (though they always play best if downloaded), and other things you’ll want to do.
A few good links, where it comes to good, concise newbie instructions for desktop Ubuntu are:
http://blog.thesilentnumber.me/2010/04/ubuntu-1004-post-install-guide-what-to.html  (I cited this above),
and http://video.answers.com/how-to-search-for-files-in-ubuntu-210072108 ,
and http://www.howtoforge.com/the-perfect-desktop-ubuntu-10.04-lucid-lynx is good (or Google:  “ubuntu perfect desktop”; or “ubuntu perfect desktop how to forge”).  The HTF site (How To Forge) has a set of easy-to-follow, graphically-enhanced instructions on how to properly “complete” almost all of the major distros, and even separate instructions-sets for different releases/builds of your desktop Linux.  These instructions will take one of us noobs perhaps a morning to execute:  but it is well worth it.  These simple pages allow you to install needed codecs and similar coding which—though free to use—are patented/privately owned, and therefore cannot be bundled-in with your download of many desktop Linux.  The author tries to include everything you’d need to do to Ubuntu, Fedora, or what-have-you to make it a full and usable replacement for a WINDOWS desktop install.  I myself have followed most of his recommendations, and I can only report positive results, where my own equipment is concerned.
also http://www.youtube.com/watch?v=2qR591lh5Ow [this one is a nice, brief ( 8 min.) video tutorial on Nixie Pixel’s You Tube Linux channel]:  this is a nice channel to check-out anyhow;
and perhaps
http://www.psychocats.net/ubuntucat/  as well.  * Note that some of the posts at THIS blog are old enough (in computer-time) to be considered “archival”—by some people, anyhow.  Issues with Ubuntu described in some of this person’s pages have largely been remediated, by the time of this writing (late 2011).  Others are still somewhat of a problem, which is why this blog is relevant, and so I include its URL here.  Further, the author is a candid and well-intentioned person who wishes to help others with Ubuntu.  So sometimes this person is able to respond to a post you can make there, time permitting, of course.  And there is a good deal of other Linux-related information at this site, and ANDROID information.

An alternative to this recent “Desktop Environment weirdness” (GNOME as it appears in 10.04, or the new Unity DE) might be Xubuntu.
What is Xubuntu?  Xubuntu is just Ubuntu which comes with the fairly-popular XFCE Desktop Environment, instead of Unity or GNOME.  It has the added “benefit” of being officially sanctioned by Canonical Ltd.—the “for-profit/not-for-profit” company that creates Ubuntu and oversees its repositories and maintenance.  You can make of this what you will.  But I tend to think it’s a surer bet that Xubuntu will work reliably, than, say, a third-party contributed distro based-on Ubuntu—Linux Mint excepted.
Xubuntu was developed with the intention that it would be an alternative to regular Ubuntu which would require even fewer hardware resources, so that it would run on machines that were too feeble even to properly run Ubuntu.  Or, if you wanted to use it on a machine that had plenty of resources, it would be faster than regular Ubuntu.  It missed the mark on this.  But it * DOES * boot faster.  Often a * LOT * faster than standard Ubuntu—which itself is no slouch—standard Ubuntu has very good boot times.
Xubuntu also is very granular, where it comes to setting-up the os the way you want it to look, feel, and respond and/or be configured.  And the vast preponderance of this can be accomplished * graphically *, where it comes to Xubuntu, just like regular Ubuntu.  But it seems to offer you more settings/customization options.  Or that is the impression that I get.
A caveat, though, is that, while Xubuntu is an “official” release directly from Canonical, Ubuntu was really built for GNOME, and now Unity.  So it seems some GUI stuff in Xubuntu is just “sticky”.  Try it as a live-cd.  For awhile.  Like for a good-while—like at least probably several days.  Or just try it as a thumb-key install, or a Virtual-Machine install.  This is good advice anyway.
In any case, there are many Ubuntu-based Linux distros you can try.  Go to en.wikipedia.org, and enter “List of Linux Distributions” in the search-window.
There is Kubuntu, too.  This is Ubuntu with a KDE Desktop Environment, instead of GNOME or Unity.  And Kubuntu seems very well documented.
Note also, in light of the above I have mentioned, that pretty much any of the known Linux Desktop Environments available on the web can be downloaded and activated in pretty much any version of Ubuntu or its variants.  (Or any other Linux distro, for that matter).  Whether or not this is a good thing for a * Linux newbie * to undertake, well, this probly depends on things like 1) which desktop-environ is to be installed to which distro; 2) how big a noob you are; 3) how determined you are; and 4) how much disposable time you have.  That’s why I generally stick with whatever the default desktop-environ is.  If I wanna try a different one, I look for a distro I can download as an .iso file, that already comes with that DE as default.  There are plenty of ‘em, and, as I think I already said, you can be doing something else on most modern computers, while the machine is downloading a couple of Linux distros.  Unused or unwanted .iso files can be deleted later, if you want the harddrive space back.
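For completeness, here is what that “download and activate a different DE” step usually amounts to on Ubuntu and its close relatives.  A sketch:  these metapackage names are the standard ones, but the downloads are large, and afterwards you pick the session you want from the login screen:
sudo apt-get install xubuntu-desktop    # adds the whole XFCE-based Xubuntu experience alongside what you have
sudo apt-get install kubuntu-desktop    # or the KDE-based Kubuntu experience
As I said, though, for a newbie it is often simpler just to grab an .iso that ships with the DE you want as the default.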
Frankly, you just might be better-off with Linux Mint.  Or one of the other, non-Ubuntu distros I have mentioned (Puppy Wary, Lucid Puppy, microKNOPPIX).  Linux Mint requires a little more computing power, however, than Ubuntu.  Or Puppy, or probly even something as muscular as SuperOS.  And sometimes this difference in “weight” is just enough to prevent Linux Mint from running properly on a netbook (like maybe one of the earlier ones), or some other relatively feeble hardware.  Standard Mint comes with an improved-upon version of GNOME, if I am not mistaken.  But there are several Linux Mint versions to choose from, some offering different DE s, and each with its own virtues.  [I am using Linux Mint 13 XFCE Edition as I write this, and I find it as easy to use as WINDOWS XP:  it has been my workhorse desktop system for some months now.  My only complaint is that it won’t connect to some of my favorite coffee-shops, in the middle-American city of circa 300,000 people where I live. ]
The “problem” I have with Linux Mint (I guess), is essentially that some of its directories may have been re-named in the re-hacking process, because its developer community builds it using Ubuntu as a starting-point.  And this might be enough to throw-off a newbie.  I guess you could try Linux Mint’s new (relatively new) “Debian Edition”, which is based directly on Debian Linux, instead of Ubuntu.
One advantage of Linux Mint is that it comes with the third-party software necessary to watch video and fully browse certain websites, so these key functionalities are already installed.  With Ubuntu, you have to install Adobe Flash and some other stuff after you start using it, I guess because Ubuntu comes from a company (Canonical Ltd.), though Ubuntu is given-away to individuals for free.  Ubuntu makes this very easy to do, though.  [Recall my comments as to the “Restricted Extras Package”, which you should install to Ubuntu from its Software Center soon after you get it up and running.]  Linux Mint, conversely, comes from a world-wide community of developers, and I guess there is some kind of a “governing board” or other delegation to keep things organized.  So a * company * CANNOT get-away with re-distributing third-party * proprietary * software (like Adobe Flash-Player)—or else it would need to present EULAs from those makers, and in a prominent way—a thing which most Linux developers seem to find unpalatable, on philosophical grounds.  There may also be certain other complex legalistic issues at-play.  But a distributed * COMMUNITY * doesn’t seem to have a problem with this—though they present you with EULAs, of course.
I am not an Ubuntu fanboi, but it just seemed Linux Mint would be going-up another layer of complexity, so I started-out with Ubuntu.  And progressively, I got it to work better and better, learning more about computing and operating systems in the process.  But it could still be worth it for a newbie to give Mint a try.

I began noticing that when I would shutdown one operating system (on my Toshiba L515 laptop) and boot into the other, it would often run its drive-check program.  (I created a harddrive-install/dual-boot arrangement on this ‘good’ lappy of mine, as I may have already said.)  Booting WINDOWS 7 after an Ubuntu (10.04) session would often cause CHKDSK to run (though from the boot/welcome-screen—and it would only take a few minutes); ditto for fsck when I booted into Ubuntu after having used WINDOWS for my previous session.  I later found out that this is normal; both systems are supposed to run their drive-checks, every so many boots.
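[If you’re curious how close your Linux partition is to its next scheduled check, here is a little sketch.  I am assuming the Linux partition is /dev/sda1, which it very possibly is * not * on your machine—check with “sudo fdisk -l” first.]
Code:
# show the mount-count and check-interval that decide when fsck runs
sudo tune2fs -l /dev/sda1 | grep -i -E 'mount count|check'
# force a check on the very next boot (the flag-file gets cleaned up automatically)
sudo touch /forcefsck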
Unfortunately, the fonts had an episode where they “went crazy” in Ubuntu, after a prolonged session of some 5 hours or so, for no reason I can put my finger on—except that I had to get up to do something, and I had just re-swapped the chairs in my computer room, exchanging the one with the low back for the high one:  I have to do this every once in a while, as I have a back injury, and cannot use the same chair continuously.  Getting up later, I forgot how narrow the aisle becomes with the high-backer, and I may have accidentally pressed some keys on the USB-keyboard that I have mounted on an articulated “bogie” on my keyboard-shelf.  This might have been when it happened.  (?)  I am rather sleep-deprived over the holidays, anyway.
It makes me wonder, too, if all those hard-shutdowns I did before I got Ubuntu running more smoothly on this machine might have damaged my harddrive.  I tend to doubt it.  But some blogs/webpages say this is possible. (These seem mostly to be old, though; rather dated sources.)  In any case, if I had just followed protocol, and actually kept reading the resources available on the web, I would soon have discovered that ACPI / battery-management (in my * laptop * computer—* NOT * in Ubuntu) was the main problem, and would have been able to easily remediate it.  But I was in too much of a hurry.
However it happened, I decided I’d better just unmount (shutdown) Linux.  [I had been using the mouse-click “file > save” in my Open Office Writer document prog. during the session—a habit I picked-up back when I still had the random-freeze problem in this install of Ubuntu (because I was too ignorant and frankly imprudent to follow the instructions I found online, and set-up Ubuntu right in the first place.)]  And so no data was lost.
But when I booted back up into WINDOWS, I found some stuff very noticeably out-of-whack.  My files were now displayed in time-order (instead of my default Name-order), and the open-action was now set to * one * mouse-click, instead of double-click.  Other things seemed askew, and I noticed it had taken WINDOWS 7 a little longer than usual to start-up.
I set CHKDSK to run on the next boot, shutdown, waited a few min., and booted to WINDOWS 7…  CHKDSK ran as expected, with the “/f” (“fix”) argument I had set.  It did not seem to find anything.  I logged in to WINDOWS Admin., and things seemed better.  So far so good.  I then switched-user to the user account I do most of my work from.  Things seemed better there too, but I had to manually re-set my files-view back from time-modified to name, in list view.  I did notice, though, that the WINDOWS GUI desktop interface seemed slower and buggy until WINDOWS had been up for quite a while.
An external harddrive from which to run Linux seems like it might be my solution, but we’ll see.  I’m really light on extra cash right now (which is often my case:  my relatives say that’s ‘cause I’m a bonehead); so I’ll probly just use the Ubuntu install once in a while, and only for limited sessions.  As I may have already said, my Acer netbook is my real Linux project, anyway.  And I’ll get around to that this Winter.
UPDATE:  I booted Ubuntu again last night, so I could finish backing-up my downloaded video clips, as I had some difficulty using the interface of Nero’s free Kwik Media, and I’m still klutzy when it comes to burning with WINDOWS Explorer.  Ubuntu’s Brasero backed-up my remaining video without issue.  I even browsed with SeaMonkey while this process was running.  Ubuntu (10.04) seemed to be running better than ever.  When the backup finished, I shutdown for the night.  The next morning, I booted into my WINDOWS user again, and I noticed GUI/mouse-pointer bugginess seemed not to take as long to fade (somewhat), and I did not have to reset the files-manager.  I’m in the WINDOWS partition right now, and everything seems to be working right, and at almost normal speed.
UPDATE:  after 3 more successive WINDOWS sessions, Win 7 seems to have “healed-up”.  A few things still are “funky”, seemingly with the GUI.  It does not act as flawlessly as it did when I booted this machine for the first time (WINDOWS 7).  It’s close enough, though, so I don’t think I’ll bother trying to do that “repair WINDOWS from disc without reformatting” thing I had planned.  Things are mostly normal (including the Paste Special dialog box in Word 2007), and, further, I have a lot of documents to scan to .pdf this week.  And I never bothered to get my hp 2500 scanner-printer working from Ubuntu, after I found a couple of the book-length hacks online.  I should have done more to investigate the issue, but I guess I didn’t phrase the query to Google just right, or maybe I’d have learned more about Simple Scan, which comes by default in Lucid.  Anyhow, during the last session (this morning), I noticed that the WINDOWS systray icon that tells you about your audio said it wasn’t working, even though all my speakers were working normally.  GF.  And in the following session this afternoon (I had shutdown in between), I noticed that FireFox still toggles between minimize and maximize * JUST BY CLICKING ANYWHERE ON THE TITLE-BAR *.  (I do not mean the addressbar—I mean the TITLE-bar)—which is what Explorer (files manager) * was * doing, though now * it * is okay.
What exactly was the source of the SNAFU in the first place seems destined to remain a mystery.  Log-files could perhaps present some clues.  But really, I won’t bother, unless it gets that bad again.
[An added note:  by the time of the next WINDOWS session, Windows was back to normal, and just like its old self—and that’s even though I have gone back to the practice of regularly booting Ubuntu.  The two seem just fine now.  Wha hoppened?? Beats me. ]
But it just goes to show you that dual-booting can (at least seemingly) cause one partition to affect another.  Maybe that is not what happened at all.  Maybe it was something else.  But it seems as if Ubuntu somehow affected WINDOWS 7.  GF.  Just further argument for live-cds and thumb-drive installs, which would probably be the way I’d go anyhow, if I had it to do again.  Another worthy option is an external, add-on harddrive, or a micro-drive (mini-external-harddrive).  These have been available for some time, but are now easier than ever, thanks to some recent advances and some free programs.   And then Linux is running from its own “disk”, and not from your (internal) harddrive.  See Appendix A of this document, c. pp 212.
I am compelled to add that, since having had this problem in WINDOWS 7, I found that trying to get help online with this WINDOWS issue seems to have been actually * more difficult * than finding help online for my * Linux * issues.  I have yet to be able to find more than one webpage that tells how to repair WINDOWS 7 from its backup cd, without having to reformat:  yet knowledgeable people have told me that this is eminently do-able.  I find Googling for Linux information can also sometimes be frustrating—but not as often as with WINDOWS, now that I have got the Linux-lingo somewhat down.
Something else I will insert here:
Even with 3 Gb RAM and a modern Intel dual-core processor on this good laptop here, Open Office Writer 3.2 “whites-over” once in a while, when opening/closing a big document (.doc format), when I am working in Ubuntu.  This used to happen every once in a while on my old Vista tower computer, using Microsoft’s own MS Office suite.  In either one (especially Ubuntu), if you just leave it alone for a while (and maybe do something else), it will properly complete.  I have found that in Ubuntu, I can often open another program, or play a game when this happens.  On my Vista tower—even though I have another Gb of RAM and a better CPU—I always have to walk away from the machine—go and make coffee, or watch some TV.  It takes Vista and MS Word a while to sort it out.
I guess I should add at least one caveat here, that I have experienced with Open Office 3.2 in Lucid Lynx.  I notice that in Office Word 2007, it does not matter if I drag the mouse from left to right, or from right to left, before I click the highlighter—I can still open the document on another system later, and remove the highlight if I want.  But in OpenOffice Writer 3.2 on Lucid, I had better drag from LEFT-TO-RIGHT, and then click the highlighter.  Otherwise it won’t let me remove the highlight later-on, without some fiddling.
If I manage to have the time to compile a custom kernel for this rig over the Winter months, I don’t think I will boot WINDOWS much anymore anyway, as I’ve gotten used to the look and feel of Ubuntu.  And my Ubuntu install is runnin’ * cherry * now, after all this happened.  Which seems * weird *.
But I notice that as long as I boot the 2.6.32-32 gen kernel, and with acpi=off, and using SeaMonkey and the other minor tweaks I’ve made (set visual effects in GNOME to 2-D/”no visual effects”), Ubuntu now seems to run about as well as WINDOWS 7 ever did.  It still seems to mess-up, however, if I boot into one of the older kernels, or one of the newer ones updates have added to my Grub menu.  Maybe I won’t bother to compile a custom kernel.  My Ubuntu is faster than WINDOWS 7 when it comes to opening and editing files that aren’t highly formatted, and a little slower when it comes to opening big files that are (OpenOffice Writer v. 3.2).  Closing a large, highly formatted .doc file used to be a chore for it too, but I have since discovered that if I hit file > save before I do it—giving it a moment to execute—it’s not a problem.  I think MS Office Word 2007 can open .odt, but not if I were to send a .odt file to some person who has only Office 2003 (?).  So I guess I’m probly gonna stick with .doc for now.  I’d use .odt though, if it weren’t for this.  In a few months, Canonical, Ltd. will release Ubuntu 12.04 LTS, which I’ve heard will come with Libre Office instead.  People on forums say this is a significant improvement.  [UPDATE:  it is.  LibreOffice 3 has solved all the problems I had. ]
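[For anyone wondering how a boot-option like acpi=off gets made “permanent” on a Grub 2 system such as Ubuntu 10.04:  my understanding is that it goes roughly like the sketch below.  The exact contents of the file will differ from machine to machine, so treat this as an outline, not gospel.]
Code:
# open the Grub defaults file in a Terminal text-editor
sudo nano /etc/default/grub
#   ...then add acpi=off inside the quotes on this line:
#   GRUB_CMDLINE_LINUX_DEFAULT="quiet splash acpi=off"
# rebuild the boot menu so the change gets picked-up at the next boot
sudo update-grub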
I also notice that sometimes, when I add images to a document with Open Office 3.2 here in my Ubuntu partition (like some * screenshots * from a webpage), and I then boot into WINDOWS 7 a day or two later, M$ Office 2007 will only open the .doc with empty boxes where the images were.  Booting back into Ubuntu, the images are still there.
It’s also true that browsing seems a little slower in my Ubuntu install—even in SeaMonkey—than when I use Firefox in my WINDOWS 7 partition.  Webpages—particularly highly formatted ones—take a little longer to fully load.  On thinking about it a little, I posit this may well be due to heavy traffic on the internet at peak times.  If and when I get time to play with Gufw (the graphical front-end to UFW, the Uncomplicated FireWall), perhaps that may remediate the issue.  I might also need to clear cookies and browser-cache.
I will also add right here, that if I open FireFox (I think Ubuntu’s updater has recently upgraded me to FF 4.0 or something, from the 3.6 that came with the distro) and SeaMonkey isn’t already running and minimized to the systray (Tint2 bottom panel), then I still have the random-freeze problem.  (Update:  it upgraded FF to version 9!—I ran dpkg in Terminal.)  And free Video Download Helper (by Mig) works less of the time in SeaMonkey, than in FireFox (in FF it * always * works).  What works for me is to use SM to browse the internet, and download everything * except * video.  SeaMonkey comes with its own download manager, which is effective with webpages that offer a download-button of some sort.  If I wanna browse for video to capture, I just minimize SM, and launch FireFox, and everything seems to work-out just fine.  I watch what I like live in FF’s web-player, and D/L it if I want with FF’s Video Download Helper by Mig.  But a lot of these types of issues would probly go-away if I just set aside the time to compile that custom kernel.  I’m gonna have to do it sooner-or-later—if for no other reason than to learn how.
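[The dpkg bit, in case you want it, is nothing fancy—just a way to ask the package system which Firefox it has actually given you.  The package is plainly named “firefox” in Ubuntu’s repos; adjust the name for whatever program you’re curious about.]
Code:
# print the installed version of the firefox package
dpkg -s firefox | grep -i version
# or list everything firefox-related that is installed
dpkg -l | grep -i firefox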
Yet I put-up-with the handful of glitches and work-arounds in my Ubuntu partition, in order to get used-to Linux and so be able to enjoy its benefits—primarily the possibility of creating a very secure desktop configuration.
My full-size lappy still has difficulty with the dual-boot arrangement.  Even though I haven’t booted into WINDOWS 7 in over a week now, I find I still have to go through 2 or sometimes 3 boots in the morning, to get a “clean boot” of Ubuntu 10.04.2 here.  I put this down squarely to the fact that I’m booting from a drive that also has WINDOWS on it.  Maybe that is not a relevant fact, but I’ll bet it is at least somewhat relevant, when it comes to using desktop Linux.  Still, I have managed to do everything I was doing on Windows 7.  I have edited and created dozens of documents, listened to my playlists on Project Playlist, watched movies live off the web and downloaded some, burned discs (actually easier in 10.04 than in W7, using the Brasero that came already installed in 10.04—even though I’ve tried numerous freewares and also Windows Explorer burning); I receive and send e-mail, and I print documents.  If I get ’round to it, I may be able to get my hp 2500 series scanner to create some .pdf s for me, perhaps using Ubuntu’s included Simple-Scan app.  So I  * am * using Ubuntu for productivity, even though I have not had a lot of serious work to do this week.
For right now, I will add only that once in a while, opening certain webpages—often ones with a tremendous amount of formatting and visual-effects—will cause a freeze-up, and I have to tap the power-switch, which I have set to shut it down through the kernel, so I no longer have to risk damaging it.  [I have set Ubuntu’s Power Management settings so that Power Manager will try to shutdown the system if the power-switch on my laptop’s case is pressed:  this attempts a “graceful shutdown”, which is much preferred over having to hold your finger down on the power-switch.  This is something I should’ve enabled from the beginning—as soon as I had installed Linux.]  I wait 20 seconds or so, and then tap the power-switch again, and it starts booting back up.
Even so, by remembering to click file > save frequently, I have not had any loss of the data I was typing.  (Remember I use Dropbox, and I find it runs every bit as good in 10.04.2 here, as it does in WINDOWS 7.)
And I just invested in a portable, add-on harddrive (500 Gb, 110 USD).  I intend to experiment with different desktop distros by using this.
MY EXTERNAL HARDDRIVE:
I had to borrow even this modest sum.  But I had no problems, it seems, in plugging it in while in Ubuntu (it’s USB-powered, so it is powered off my laptop); opening it as a folder; copying the owner’s manual and other included literature (.pdf s) to some backup media; and then shutting-down and installing Ubuntu (10.04.3) to it using the normal i386 live-cd.  I’ve only booted it a couple of times, so-far.  But it seems to run better than my 10.04.2 internal harddrive install (which Update Manager has now upgraded to 10.04.3).  No random-freezing—even seemingly before I set its Grub to acpi=off.  (I BIOS-boot it of course, which by-passes the Grub installed to my internal hdd.)  [By the way, I notice my Ubuntu on my INTERNAL harddrive now identifies itself as 10.04.3 as well.  This must be because I regularly take the updates offered by Ubuntu’s Updates Manager.  I can run
Code:
flyingfisherman@slakerzskull:~$ cat /etc/*-release
and I’ll get
DISTRIB_ID=Ubuntu
DISTRIB_RELEASE=10.04
DISTRIB_CODENAME=lucid
DISTRIB_DESCRIPTION="Ubuntu 10.04.3 LTS"
]
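[Another way to ask the same question—assuming the lsb-release utility is present, which it is by default on Ubuntu—is:]
Code:
lsb_release -a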
During one of these early sessions, I downloaded G-Parted from the EXTERNAL hdd 10.04.3’s repos (since it wasn’t listed in installed programs in the Software Center), and then proceeded to slice-and-dice the huge 500 Gb harddisk to my liking.  (Be DANG sure G-Parted is pointed at the EXTERNAL drive, if you’re gonna do this, because you could over-format your WINDOWS install by mistake.  And there might be one or two “blue-moon” occasions when we need Windows in the future, even after we have gotten used-to Linux.  You could even over-format both Windows * and * your Ubuntu partition by mistake, or otherwise messing-up the configuration of the wrong drive, necessitating a re-format—so really watch yourself, and know what you’re doing, if you’re gonna use G-Parted.  It’s a good graphical program, but it requires some knowledge on the part of the user.  There are many instructions for it online, though; and there are many articles pertaining to partitioning and manipulating harddrives, generally.)
I shrank Ubuntu down to 60 Gb., then I created a 30 Gb. partition.  And then I created a measly 3 Gb. primary partition, which I formatted to FAT 16 (and to which I assigned the volume-label “fatty”); this I intend to use to boot a frugal install of microKNOPPIX.  Next I created a third primary partition of some 60 Gb., formatted to .ext4.  Then I created a fourth that used-up the rest of the disk.
I then deleted this fourth partition, and converted it to an Extended logical-drive partition.  This is a general rule:  we can have up to 4 primary partitions on a disk; but if we want to have an Extended also, then the fourth primary has to be “sacrificed” to that.  I then set-about creating logical-drives in the new Extended partition, of 8 and 10 Gb. variously.  I formatted most to .ext4 for now, but I can change it later.  I think you only get to have 12 logical-drives, making for a total of 15 partitions in all.  But I didn’t make that many in this session.
I also created a measly 3 Gb. logical-drive (to which I assigned the volume-label “swappy”), the same size as my computer’s RAM.  In G-Parted’s “make new partition” dialog-box, there’s a drop-down menu that allows selection of the desired format:  I clicked it and chose “linux swap” from the menu that appeared.  I intend to use this, of course, as a swap-partition for the distros I want to install to this portable harddrive.  And I intend to use the Grub that got installed to the disk with Ubuntu 10.04.3 as the primary boot-loader, and daisy-chain the rest off that.
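[For the command-line-curious, here is roughly what the same swap setup looks like from a Terminal.  This is a sketch only:  I am * assuming * the external drive shows up as /dev/sdb and that the new swap space landed at /dev/sdb7; both will very likely be different on your rig, so look before you leap.]
Code:
# see how the external drive is actually laid-out before touching anything
sudo fdisk -l /dev/sdb
# format the chosen partition as swap, with the volume-label "swappy"  (the partition number is an assumption!)
sudo mkswap -L swappy /dev/sdb7
# switch it on for the current session
sudo swapon /dev/sdb7
# confirm the kernel is really using it
swapon -s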
I guess a kind of “rule-of-thumb” I’ll throw-out here (though I’m no authority) is more-or-less as follows:
“One of the groovy things about Linux is that you can try stuff you’re not really qualified to do.  JUST MAKE SURE YOU’RE NOT EXPERIMENTING ON SOME DRIVE YOU CARE ABOUT, OR NEED TO RELY-ON TO DO PRODUCTIVITY (like your internal harddrive that has always harbored your Windows install).”  For those of us who’re “not yet so geeky”, installing Ubuntu or Linux Mint to our internal harddrive to dual-boot with Windows is probly as risky as we should get.  Thumb-key installs, too, can be about as risky, because there is the possibility of selecting the wrong drive by mistake.  But it’s still like they say:  “No pain, no gain”.  Or if you prefer, “If you wanna make an omelet, you’ve gotta break some eggs”.
I have successfully chainloaded Xubuntu 11.04 (an Ubuntu deriv) from Lucid 10.04.3’s GRUB.  I just re-installed 10.04.3 to the first partition of my external hdd (after making some changes to the partitions, and re-formatting some of them); no other os was on the drive at the time.  Then I installed Xubuntu to the second partition (which I had re-made from 3 Gb. FAT32 to 34 Gb. .ext2, as I changed my mind about trying to boot microKNOPPIX from that location just yet—I need more time to figure it out).  I made sure I pointed Xubuntu’s Grub at its own partition, during the install process.  [There’s a small separate window in the lower-part of the installer interface (the Ubiquity installer:  BE SURE TO NOTE THAT WHILE JUST ABOUT EVERY VARIANT/RE-MIX OF UBUNTU USES UBIQUITY, IN CASE YOU WANNA INSTALL THE DISTRO, THE INSTALLER IS GIVEN A RATHER DIFFERENT APPEARANCE IN MOST VARIANTS; SO BUTTONS—LIKE FOR INSTANCE THE ALL-IMPORTANT OPTION TO SET THE LOCATION FOR BOOTLOADER-INSTALLATION—CAN GET MOVED-AROUND, AND YOU REALLY HAVE TO WATCH.  Back out if you’re not sure:  but * DO * * NOT * “cancel” the install once you’ve let it start:  any mess you make by letting Ubiquity do its thing will be easier to clean-up than if you abort an install that has already commenced.)  Also, the installer wants you to re-format the partition, which is why I had the space ready as .ext2—so it could go ahead and re-format it to .ext4 when the installer began to execute.]
Then I shutdown, and BIOS booted the external hdd back-up into Ubuntu 10.04.3.  Once there, I ran sudo update-grub.  When the process finished, I closed the terminal, and shutdown.  Then I again BIOS-booted the external hdd, and used the arrow-keys to scroll through the Grub menu, when I was presented with it.  I found a new entry buried in there, labeled “Ubuntu 11.04”.  I knew this would be the Xubuntu 11.04.  I selected it, and the programs successfully chain-loaded into it.  I arrived at the Xubuntu desktop in a matter of seconds, and proceeded to install some stuff.  Everything seemed to work.  I’ll try and update this post, when I’ve had more time to play with the external hdd.
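[In my case, plain old “sudo update-grub” found the Xubuntu install by itself.  If it ever doesn’t, my understanding is that a chainload entry can be added by hand to Grub 2’s custom-entries file and the menu rebuilt—roughly as sketched below.  The “(hd1,2)” is only my guess at “second disk, second partition”; yours will almost certainly differ, so research the Grub device naming before trying this.]
Code:
# open Grub's custom-entries file, and add a stanza like the one shown in the comments
sudo nano /etc/grub.d/40_custom
#   menuentry "Xubuntu 11.04 (chainload)" {
#       set root=(hd1,2)
#       chainloader +1
#   }
# then rebuild the menu
sudo update-grub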
I am intent upon installing Mint to the external drive, so I can experiment with that and get the feel.  And probly also Ubuntu 11.10 with Unity (which I intend to tweak), and probly also Mikebuntu, and perhaps PeppermintOS, and FreeDOS, and Semplice Linux, and perhaps also Bodhi, MacPup, and maybe even Debian—stuff I’ve been wantin’ to try.  (We’ll see if I’m able to  successfully chainload all that.)    Probly also some rescue stuff, like BackTrack or System Rescue, and maybe Hiren’s boot-disk (if that one can be successfully installed at all).  And Ubuntu 12.04, when that reaches scheduled release.  Actually, I think I may just wait about 6 weeks—until, say, the first week of June:  give ’em a little extra time to try & de-bug a little more, and perhaps let ’em see what the Linux Mint ppl found (assuming the Ubuntu devs even have time to look at that).
I think I’m gonna call this my “magic USB-harddrive-of-distros and computer toolbox”.  MS WINDOWS pretty much wants to be in the first booting partition on the disk (and that of course has to be one of the four “Primary”-type partitions the legacy IBM/MBR partition-table “schema” will let us create on pretty much any harddisk—no matter that it’s now relatively cheap to get a 500 Gb. one).  Linux, however, will usually boot just fine from any space on the disk—even a so-called “logical drive” of some measly 8 Gb. that you created in an * extended * partition (or maybe even less than 8, depending on what distro).
If you want to be able to run Windows as well as Linux (and we do), and somehow you cannot have it installed to the first partition, then sometimes it can be “fooled” by a technique known as “disk-swapping”, which I believe is a virtualized process.  Googling this will get you some relevant information, if you’re interested in it.
Linux is versatile, in that it can be situated-on, booted, and run-from pretty much any drive, disk, partition, and/or “logical”, formatted to .ext3 or .ext4, or btrfs or ReiserFS, or .ext2 (though .ext2 doesn’t “journal”); and it is often the case that live-Linux runs from the old FAT32 format developed by Microsoft for * WINDOWS *; and it is also true that most modern Linux distros will run (as live-boot) from * NTFS *—which is a volume-format WINDOWS invented and prefers!
Note that the way you create a so-called “extended” partition is that you mark one of the Primary partitions on the disk as “to be deleted”—if there are four (4) Primaries.  If your harddisk has only 3 Primary-type partitions—or maybe just one big NTFS partition—then you can just shrink the existing partitions if need be, and then just click on the “unallocated” space, and then direct the disk-manipulation tool you’re using (G-Parted, Parted Magic, EaseUS Partition Master, etc.) to “create an Extended Partition” out of this “unallocated space”; when that’s done, you can create up to about 11 “logical drives” * inside * this “Extended” partition.  Each logical drive (“LD”) will act and behave just like any disk partition; the only difference is that you could not install another copy of ms Windows to such a “logical” partition—if, say, you had a cd of Windows XP Pro you bought but never used.  Microsoft Windows is built to only operate from one of these “Primary-type” partitions.  It * is * possible to circumvent this restriction, with a technique called “disk-swapping”:  but that is beyond the scope of this article, and it may violate Microsoft’s EULA (“End User License Agreement”).  So, to recap:  I think some partitioning tools allow you to just delete any partition with one click.  Then you click in what has become “unallocated space” to highlight it (or highlight it some other way); then you get your partitioning tool to re-assign this space as “extended”—in effect creating a new, extended partition.  If the partitioning tool you’re using won’t let you do this, try using Parted Magic (a kind of free Linux version of the famous pay-for program “Partition Magic”), or try another tool.
JUST BE DANG SURE THAT YOU DON’T DELETE THE PARTITION THAT HOLDS YOUR WINDOWS SYSTEM, OR ITS RECOVERY IMAGE.  Sussing this out can take some research on your part.  An easier route is probly to just go ahead and install a copy of desktop Linux (like Ubuntu) to your computer, using the “side-by-side”, dual-boot option in the graphical installer, and just going with the defaults.  This way, the installer will make the technical choices * for * you, and you can learn the rest later.  Any time you don’t feel confident (before you’ve actually clicked “install”, anyway), you can back-out, or just quit the program, by closing it.  BUT ONCE YOU HAVE ALLOWED IT TO BEGIN THE ACTUAL INSTALL PROCESS, TO INSTALL LINUX, YOU SHOULD NOT INTERRUPT IT.  It can be undone later.  Forcing a close of Ubiquity, Anaconda, or other GUI Linux installer once the process has commenced, can result in corrupted sectors on your harddrive, and you may not be able to get these back.
So, are there some risks to messing with Linux?  Yes, especially if you don’t read enough to have a decent understanding of any process you’re about to attempt that alters drives or partitions.  You can even get your harddisk in a bind just by doing certain things in Windows, if you similarly misunderstand or bungle certain processes.  Linux doesn’t come entirely without risks.  Indeed, this principle is true of most advantageous opportunities in life.  But if you will read enough (here and elsewhere) to have a decent command of any new, potentially risky thing you’re about to attempt on a computer, you will probably be okay.
The way I created an extended partition (which can be cut-up into multiple “logical-drives”) was that I selected the fourth partition I had made out of the remainder of the disk (about 400 Gb. in size), and clicked “delete” in the G-Parted menu.  This does not seem to really delete the partition, but rather just marks it as “unallocated”, so you can then change it to an “extended” partition.  So the extant formatting seems to stay in there.  In any case, it only took the program about 1 second to change the whole 400 + Gb partition from Primary to “unallocated”, so I’m assuming it does not go to the trouble to change the underlying format until you give it another command involving the type of format.  I think I then selected it again, clicked on “partition” on the G-Parted bar, selected “new”, and then moved the slider to shrink-down to 20 Gb., and then clicked “logical”; and then typed a name for the space in the window; then “ok” or whatever it says; then clicked “edit” from the top-bar; then “apply all operations” (that you have set-up), and then “confirm”.  This created a logical-drive inside the “extended”.  I did this twice more.  Linux can boot from a logical-drive.  So can just about anything else:  the exception is Microsoft products, which are built to require a “Primary-type” partition from which to boot.  But the rest of the “operating environments”—including things like MemTest, KNOPPIX, Portable Firefox and FreeDos—don’t seem picky.
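[For the curious, roughly the same slicing can be done non-graphically with the “parted” tool.  A sketch only:  I am again assuming the external disk is /dev/sdb, the sizes are just examples, and the first logical drive on an MBR disk usually shows up as partition number 5.]
Code:
# show the current layout first
sudo parted /dev/sdb print
# turn the free space at the end of the disk into an extended partition (start/end points are examples)
sudo parted /dev/sdb mkpart extended 100GB 500GB
# carve a 20 Gb logical drive out of the extended space
sudo parted /dev/sdb mkpart logical ext4 100GB 120GB
# mkpart only marks the partition; actually format the new logical drive to ext4
sudo mkfs.ext4 /dev/sdb5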
I will make some of these “Logical Drives” about 6 or 8 Gb.—I am told that that is about the minimum, when it comes to a normal harddrive install of most desktop Linux distros.  From my own experience, if you want to actually * do * some stuff from Ubuntu (as opposed to, say, just using its Grub for bootloading multiple other os on the disk, or playing with a new interface from a fully-installed environment—though most people use a live cd, thumb-stick, or VM-container for this latter), then most releases of Ubuntu (except “trial builds”—like Daily Builds) require a minimum of about 7 or 8 Gb.  But this is speaking * generally *.  Much depends on just exactly what you are wanting to * do * with the os.  Remember, too, that the file-system on the live-cd is in compressed form, and it “de-compresses” when you run the installer to put it on a harddisk.  Many releases of Ubuntu will de-compress to about 2.8 Gb.  But as soon as you add the waiting updates, and the Restricted Extras package (necessary, if you want You Tube and MS fonts to work, Java-functionality, & rel.), you’re at something like 4 Gb.  Add SeaMonkey suite or some program you want to use but which didn’t come in the default system-image, and you might be pushin’ 4 + Gb., total volume size.  It depends.  Research it, if you’re not sure.
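[If you want to see how big your * own * install has actually grown, the Terminal will tell you; the “/” below just means the root of whatever installed system you are currently booted into.]
Code:
df -h /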
I am gonna have fun playin’ with this external harddrive.  We’ll see how it goes.
I guess I should perhaps mention here that when I boot the EXTERNAL drive, I BIOS-boot it, just like a bootable thumb-key, and I then am handed-over to the Grub of Ubuntu 10.04.4, which resides on the first partition of the drive.  It is this instance of Grub which controls the MBR of the External harddrive—I am not using the Grub that is installed to my INTERNAL hdd, which came with Ubuntu 10.04.2 back when I created the dual-boot arrangement with * it *, by installing side-by-side with WINDOWS.

DRIVER-ROLLBACK IN UBUNTU
Unlike in WINDOWS XP/Vista/7, there does not seem to be a driver roll-back feature in Ubuntu and its variants.  At least not as “conveniently” as in WINDOWS.  And there seems no way to have granular, graduated control of the updates Ubuntu’s update manager wants to install, except going through them one-by-one.  You are presented each time with the option not to install any given update, by un-ticking the box next to it.  Linux Mint, by contrast, segregates updates into four or five categories, and sometimes gives some comments about those that might be a problem.  I will say that I have yet to have Ubuntu’s updates break something, like some piece of software in the system.  (Meanwhile, my Word jump-list in WINDOWS 7 has stopped working all by itself, so I now have to use the workaround of searching for those files by time-stamp, when I’m working in my WINDOWS 7 partition and I can’t remember all of the docs I had open yesterday.)
[there is also a good thread pertaining to this at http://www.howtogeek.com/howto/29584/safely-remove-ppas-and-roll-back-to-stable-versions-in-ubuntu/  ]
One way to roll-back drivers, I am told—in Ubuntu  * and * its * variants * (especially Xubuntu and Kubuntu—this may be riskier in other variants)—is to install a utility (from Terminal) that will allow the purging of PPA drivers (and probly any and all PPA stuff you might have installed).  It goes like this:
[ Reverting to Default Drivers ]
You may revert to your original distro driver versions. Simply install the following package:  (in the Terminal:)

Code:

sudo apt-get install ppa-purge

… and run the following command:

Code:

sudo ppa-purge xorg-edgers

Answer ‘yes’ to everything and the system will roll back the drivers. This will also disable the xorg-edgers repositories in Synaptic, so make sure to enable them again if you want to try different drivers in the future.
Remember that this will probably only affect PPA drivers and software.  PPA means “Personal Package Archive”.  In plain English, this boils-down to “some Person has written a driver (or other program), Packaged it up, and put it in a server-Archive”.  For any of us to download and try, if we want to.  Ubuntu and most distros are arranged so that you have to go through a few extra steps to download and install anything from a PPA.  And it is like this for a reason.  Nonetheless, this (above) utility and its command may restore your default driver settings, because, say, the ATI driver you downloaded for your AMD-made graphics setup may fall under the classification of a “PPA driver”.  Or it may not change things on your system at all, because it detects no PPAs.  In any case—as with all of Linux and its associated software—if you’re going to use it (or any advice at all from little-old-me), you do it at your own risk.
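[If you are wondering whether your system even * has * any PPAs configured, a quick way to look—at least on Ubuntu and its variants—is to peek at the apt source lists; PPAs added the normal way usually get their own little file under sources.list.d.]
Code:
# list any PPA source files that have been added
ls /etc/apt/sources.list.d/
# or search all the apt sources for Launchpad PPA lines
grep -R "ppa.launchpad" /etc/apt/sources.list /etc/apt/sources.list.d/ 2>/dev/null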
Source:
http://www.overclock.net/t/621497/how-to-easily-install-the-latest-ati-open-source-video-drivers-under-ubuntu

Or, perhaps more sensibly, one could just uninstall the new driver—after first tracking-down a copy of the old one and placing it in a temporary directory, so it can be put back.
I am informed that another way to work this (sort of), with respect to your GRAPHICS, anyway, is to restore xorg to default.  Again, this is aimed essentially at Ubuntu, and probably at recent releases (roughly 9.04 through 10.04).
“This will explain how to remove, install, and reconfigure xorg, without reinstalling Ubuntu.  This is very useful if you mess up your xorg file.
Open terminal and run these commands

Remove existing xorg using the following command

sudo apt-get remove --purge xserver-xorg

Install xorg using the following command

sudo apt-get install xserver-xorg

Reconfigure xorg using the following command

sudo dpkg-reconfigure xserver-xorg

I hope this helps some users to fix their xorg problems.”

Again, as with anything I have to offer here, you should research it some, before you actually do it.
Source:
http://www.ubuntugeek.com/ubuntu-tiphow-to-removeinstall-and-reconfigure-xorg-without-reinstalling-ubuntu.html
[ Check-out BackInTime, FlyBack-for-Linux, and MondoRescue.  Also Deja Dup (pronounced “Day-zhah-Doop”).  There are other utils that may stand-in just fine for WINDOWS’ Backup and Restore.]
29. Entry 29:  HOW GOOD IS YOUR MEMORY?        Be aware of the hardware-system requirements for the distro you want to try.  Especially if you’re planning to use your Linux from a netbook, notebook, or tablet.  Learn the minimum (and the recommended) RAM and CPU requirements.  These are available from the distro’s home page.
Quite frankly, Daddio, while we’re bein’ frank here, and also earnest, I feel compelled to inform you that, while Linux is supposed to be famous for its ability to run on older computers without good hardware resources, it seems to be a F.O.T. (Fact Of Truth) that using a swap-file or swap-partition is just not a substitute for sufficient RAM.  Okay, what the hell is “swap”?  Swap is just the Linux-way of saying Virtual Memory.  Virtual Memory in WINDOWS is just a pre-arranged way for the operating system to use part of your harddrive (or some other device with spare storage space) to “help out” your RAM, when you the operator start some big process, like, say, opening some big app.
Virtual Memory is a practical necessity in WINDOWS.  It is also arguably one of the reasons WINDOWS is less secure.  If you want to encrypt some files, for instance, you had better also encrypt swap, or the purpose of the encryption may well be defeated.  And Linux swap that is set to come out of a USB-stick can be very slow.  WHERE THE SWAP SPACE IS TO COME OUT OF AN INTERNAL HDD, HOWEVER, IT IS NOT SLOW, but pretty quick.  But this misses my point.  Further, if any Linux has ever been installed on your computer’s harddrive—like probably ever—then most Linux distros that you decide to run from a live cd or otherwise experiment with, will probably “find” some residual “nix”-formatted space (even if only 10 Mb), and use it for swapping.  This can be controlled by appropriate commands.
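[To see what swap, if any, your running Linux has actually grabbed, these two Terminal commands will tell you:]
Code:
# total and used memory and swap, in megabytes
free -m
# which device or file the swap is actually coming out of
swapon -s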
I really feel we are better-off to just invest in a RAM upgrade.  I find it makes Linux run better, and will also make your WINDOWS install run somewhat better.  It can be easily undone, too, on most machines.  Memory chips to-day are just not that expensive.  A lot of us just plug-in the new chips ourselves, and there are plenty of tutorials on the web as to how to do this.  Be sure you ground yourself first, by grabbing something with your free hand that will transmit any static electricity that may normally reside in the human body to the earth beneath your feet.  This could just be a metal table, or something else that will dissipate any charge.  I use an electronics-hobbyist’s wrist-strap, with the curly telephone handset-type cord attached, and the alligator-clip on the other end of that I attach to the steel bedsprings beneath my mattress (I have the old-timey kind), which in-turn is connected to a steel rod driven into the ground outside my bedroom window, and this with a 20 AWG length of utility-wire (it was a piece of scrap from my day-job).  If you give your PC a good carpet-shock while working on it, it may never be bootable again.  Some of the internal components tolerate as little as .5 volt (only ½ of one volt), at a few millionths of an ampere.
Pre-owned memory chips can be had on the web, and for less.  Being used does not seem to affect their performance.  It is important to obtain the ones that have the right speed and other specs, and which will therefore be compatible with your machine and its mother-board.  If you can’t make it work, there are a lot of computer shops.  Some of us just go straight to the computer shop.  As with any service, research the shop’s reputation.  A RAM upgrade nowadays is just not that expensive.
I guess I will add that I run Linux on four of the five computers I currently own, and they do the things I want just fine with no RAM upgrades—though the old Pentium 3 notebook was upgraded before I got it.
Maybe this is a place to also add that I could not get Linux to boot on one of my neighbor  Jim’s custom-built towers, nor his old 512 memory Dell tower, nor my neighbor Diane’s new e-machines tower.  I was only given the opportunity to try a few of my discs, though.  But then the BIOS controls were so “funky” on Diane’s that I could not even find “boot from cd”, so I never really got to give it a try.  So  that one doesn’t count.
I might have been able to get Linux up-and-running on at least one of these, had they allowed me enough  time with the equipment.
30. Entry 30:  NVIDIA CARDS:         If your graphics card is Nvidia and you decide to use Ubuntu, you may have to go through some extra steps to get the system working.  But the Ubuntu people have taken pains to facilitate this, and make it as easy as possible for you.
31. Entry 31:  HOLY KALEIDOSCOPE BATMAN!  I JUST BOOTED MY LINUX AFTER INSTALLING IT TO THE INTERNAL HARDDRIVE, AND NOW THE SCREEN IS UNREADABLE!!
If this is the case, the distro is probly having an issue with auto-adjusting the screen resolution.  If you hold down ctrl and alt at the same time, and then with your other hand press F1 (in Ubuntu and its variants, and many KNOPPIX versions), you should be taken to a virtual console—a plain text terminal that sits “beneath” the graphical session.  From there, you can use the non-graphical environment to fix problems, such as adjusting the screen resolution.  (Ctrl + alt + F7 usually takes you back to the graphical screen.)  I’d just try a few re-boots first.  Often Grub gets installed with a Debian-based distro with more than one kernel-option to boot from.  There may be another kernel in the Grub screen that will be copasetic with your mother-board.  You can shutdown Linux from the virtual console (or from Terminal) by typing “sudo halt” & hitting Enter (without the quotes).  In the Puppies-Linux, I think it is “shutdown -h”.  In some versions of KNOPPIX, I think it might be ctrl + alt + F5, instead of ctrl + alt + F1, but you’d better check that.  A few distros out there might still require “shutdown -h now”.
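[One thing I am told is worth a try from that text console—assuming your distro uses GDM for its login screen, as Ubuntu 10.04 does; others use KDM, LightDM, etc.—is simply restarting the graphical login manager.  And you can always shut the machine down cleanly from there, too.]
Code:
# restart the graphical login screen (GDM is an assumption; substitute your distro's display manager)
sudo service gdm restart
# or, if the screen really is hopeless, just shut down cleanly from the console
sudo halt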
32. Entry 32:  Most of the more popular Linux distros have a Wikipedia article about them, which is worth at least a perusal.
33. Entry 33:  At least read through the release-notes of the particular distro—especially if you decide to use it on a permanent basis.
34. Entry 34:  LINUX STELLAR PRIVACY AND SECURITY:        Linux is already very secure, just like it comes.  But even this can be easily improved.  More about this in succeeding entries (and Appendix A, c. pp 212), or perhaps I will just devote it to other documents.
35. Entry 35:  OTHER CONTENDERS (for noob-friendliness) would probably be Crunch Bang Linux [often abbreviated “#!”—the “shebang” symbol that goes at the very top of a BASH/UNIX script to tell the system which interpreter should run the rest of the file; hence “the whole she-bang”].  At least versions of Crunch Bang prior to its being based directly on Debian, instead of being based on Ubuntu, which in turn is based on Debian.  If you can get the newer “Debian only” releases to boot on your machine, great!  But I lost interest in CB, after I couldn’t get the Statler release to boot on any of my equipment.  It would hang during boot every time.  On all three machines.  It is also worth mentioning that Crunch Bang uses a * very * spartan default desktop environment (Openbox, with some bits of LXDE).  Which makes the DE able to run on a * huge * variety of hardware.  If you can get the underlying distro to finish booting.  Openbox is pure black-and-white, and there’s no windows-like “Start” button displayed by default—instead, you access the menu by a simple RIGHT-click anywhere in neutral space on the screen.  This can throw some of us Windows folk, trying to get used to Linux.  It’s worth mentioning that CB, like most current and former Ubuntu variants, comes with the Flash, Java, & rel. stuff (“Proprietary codecs”/”Restricted Extras”) already installed.  Next, I’d say MEPIS Linux and its sub-distro “MEPIS AntiX”.  Also probly Linux Mint.  Some say PinguyOS.  This latter is a newer one, but it is looking pretty good so far.  I am going to check into it further.  I need to find out how good support is for PinguyOS; it might be very good, but I have not had a chance to research it just yet.  I will probly confine this blog mostly to the “big three” in entry No. 28 (above), because as my aunt used to say, “That right there is enough for now, kids”.
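[Backing up a second to that “#!” I mentioned at the top of this entry—if you have never actually seen one, here is about the smallest possible example of a shell-script using it:]
Code:
#!/bin/bash
# the line above is the "shebang"; it tells the system to hand the rest of this file to /bin/bash
echo "Hello from a shell script"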
36. Entry 36:  TRY A FEW DISTROS AS LIVE-CDS, TO FIND THE ONE THAT’S RIGHT FOR YOU AND YOUR MACHINE:    Remember that Linux distros to-day come as a download that can be burned to a cd or DVD which will then run as a “live” environment—it will run off-of the cd, files decompressing into the computer’s RAM as needed (except for Puppy and a few others, which have such small file-systems that they load fully into even 512 Mb of RAM).  So you get to experiment with the Linux distros of your choice, and not have to install one to your harddrive (or thumb-stick) until you have found that it seems to run well on your particular hardware.  Get in some use-session hours with a live cd/DVD, before you let yourself become confirmed in the opinion “this is the one for me and my computer”.    Except for Puppy and KNOPPIX (and a few other distros), Linux distros are (* generally *) made to run from a harddrive, and the live cd is propagated for the purpose of letting you experiment with it on your machine and its particular hardware, for perhaps a few days or a week.  Running from a thumb-key, however, is different again, and can be just as fast, depending on how you set things up.

It may help you to first try some releases of desktop Linux (Ubuntu, KNOPPIX, Puppy, PinguyOS, etc.) that were released in about the same time-frame in which your machine was manufactured. Often this is helpful. Let’s remember that a Windows computer is generally associated with the time-frame in which its version of the Windows operating system was current. A Windows 2000 computer will usually have been manufactured around that time; an XP computer will usually have been built somewhere between 2000 and 2008 or so. Machines sold with Vista were usually built around 2007 or 2008. The dates for any-and-each of the various Linux desktop distro’s final release is a matter of public record, and is easily discerned from the distro’s online documentation.
NOTE that any live Linux disc (cd or DVD) is gonna be  S – L – O – W – E – R  than when/and/if Linux is installed to and running from some other media, like an internal harddrive, or an external/add-on harddrive, or even a USB thumb-stick.  Slower to boot-up, and slower in running.  Not having enough hardware resources/”computing-horsepower” can also cause Linux that is being tried from a live cd to be slow, or to “hang”.  This is just something you have to suss-out.  Puppy Linux, however, is rather an exception to this, because Puppy is one of the comparatively few distros that is small enough to have ALL of the File System loaded into the computer’s RAM as the boot process finishes—and Puppy is pre-configured to do so.  It is because of this feature that Puppy Linux will run just as fast from a live-cd as from any other boot-media.  There are other distros that are like Puppy in this way, but they are comparatively few.  KNOPPIX also—though it does not load all of its FS into the RAM—is somewhat exceptional in this regard, as it normally runs as fast from a live-media, and the creators discourage harddrive installation anyway, because KNOPPIX is made to run from a disc or a thumb.
A VERY IMPORTANT NOTE/UPDATE TO TACK-ON HERE, IS THAT MOST USERS (by the time I’ve had a chance to insert this) ARE NOW PROBABLY SKIPPING THE * BURN * THE * CD * PART, BECAUSE WINDOWS-BASED PROGRAMS LIKE UNIVERSAL USB INSTALLER from pendrivelinux.com, et al (free!), ARE INCREASINGLY PERFECTED, AND SO YOU CAN TRY-OUT DESKTOP LINUX FROM A BOOTABLE USB KEY YOU MAKE IN ABOUT 15 MINUTES, AND IT WILL RUN AT VERY NEAR FULL-SPEED!  This is now a * better * way to test-run desktop Linux, and it will usually install just fine from the USB-drive, if you decide to do so.  Further, it usually won’t hurt the USB-thumb drive (“pen-drive/jump-drive”), and you could re-use it later, for other stuff, just by re-formatting it back to FAT32.  One caution about this method, though, is that you do run (some) risk of formatting the Linux distro to the wrong drive, if you’re   a) really inexperienced, or   b) really tired.  So be careful!  Or just stick to trying desktop Linux from a cd/DVD.  CDs and DVDs are still good, though, to make backup-copies of all your text, documents, batch, audio and video files (and your operating system(s) too), once you’ve installed the systems and programs you want to your disk.  Always make backup-copies of your Windows operating system on cds/DVDs as soon as you get a new (or used) computer, and then once again before you mess with Linux, BSD, or any other system.
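[One habit that helps avoid writing the distro to the wrong drive:  before you let any USB-creator loose, plug the thumb-key in and make * sure * you know which device it is.  From a Linux session, the Terminal will show you; the little 4 Gb stick usually stands out from the big internal harddrive by its size.]
Code:
# list every attached disk, with sizes, so the thumb-key is obvious
sudo fdisk -l
# on newer systems, lsblk gives a tidier tree view of disks and partitions
lsblk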
Note that you can, if you wish, just order most of these Linux builds from Amazon, or otherwise online.  Most of the major distros offer certified, “official” cds and DVDs.  As with anything you’d buy online (or anywhere), research it some first, and find out about the source/seller.
When it comes time to choose a distro with the intention of permanent use, I prefer to try installing it to a thumb-key, and try running it from that.  If it works well, then that’s the way I’d continue to run it.  I prefer this, generally, to installing the Linux to my hdd, and I find it to be the way of the future.  (It is also true that a lot of people run Linux as a “virtual machine”, using virtualization software, such as VMware.)
[The BIOS of some machines (especially older ones) does not have the option to boot from their USB hardware port/software interfacing.  This can be overcome by learning to make a special type of “boot-helper cd”.  You could look at the file “l boot from usb when bios is not able”, or Google the issue if it crops-up for you.  Sometimes this lack of USB-connection bootability can be remedied by just updating the BIOS’ firmware (the miniature operating system that drives BIOS).  This is sometimes referred-to as “re-flashing the BIOS”.  Better let a professional tackle this one, unless you’re already WELL on your way to being a hardware tech professional.  If a BIOS re-flash procedure doesn’t come off right, the machine will probly not be able to boot anything again, until it is straightened-out.]
Remember that your computer is not like a toaster-oven or even an automobile:  a model 8800 xy of some toaster-oven may be essentially identical to all the others of that model produced; but this is not true of most modern personal computers.  Some models do not even use the same mother-board, from one production-run to another.  It is the nature of the business:  computers become more and more powerful very quickly, and so the pressure is on to rush a production-run to market, as soon as possible—whether the same components are available from the vendors used in the last run or not.  So just because it says “HP Pavilion Slimline” on the outside, that does not mean that all the critical components on the inside are the same as the one in your neighbor’s living room.  So the same Linux distro may not run smoothly on both of them.
So you may have to try several live Linux cds, until you find one that recognizes your graphics and network cards, and most of your other hardware stuff.  [I recently stumbled-upon a page that gives us an indication of * which * WI-FI cards are Linux-supported.
https://help.ubuntu.com/community/WifiDocs/WirelessCardsSupported ]
A GENERAL DISTRO RULE OF THUMB: if the distro fails to recognize more than two of your hardware pieces (say, the internet, sound, and screen resolution), you’ve got the wrong distro.  Don’t fight it.  You can if you want to, but it’ll just be frustrating.  Just move-on to another live cd.  Yeah, Linux can be a time-consuming, trial-and-error process.  There is a “distro-finder” site on the web, which lets you answer some basic questions about your computer (make and model, whether you intend to use it as a desktop or server, etc.), and it will then “compute” a short list of the distros you should try.  I recommend this site:  I find it to be a help, and a good first stop.  It may be the only stop you need to make.
http://www.linux-laptop.net/
37. Entry 37:  USE “LOCAL MIRRORS”:        Before you go-ahead with downloading any Linux distro, you should try to pay attention to where the source of the file is located.  Let’s try not to download Linux from a server in Hong Kong, if we are in Chicago.  Try to find a closer site—at least one that is in the United States first.  That’s what they mean by exhorting us to “use local mirrors”.  If you live in Saint Louis, a mirror-site in Texas or even California can be considered local, for this kind of d/l, at least.  I downloaded a huge Linux .iso over the regular internet (with a broadband conn.) all the way from Germany, not too long ago, because I still could not find the file of the KNOPPIX DVD I was after on servers in the U.S.—I guess because of the “October surprise” some black-hat hackers handed to Kernel.org, and I think one or two other Linux orgs, this autumn (2011).  The checksum still matched, so I went ahead and burned the .iso to a DVD-R.  But you know, the longer you make the pipe-line, the more chance of such a complex file as an operating-system being corrupted in the process.
38. Entry 38:  USE THE CHECKSUM:    Before you are about to burn a Linux cd (or DVD), md5-check the downloaded file for integrity first.  There are instructions on the web and in here as to how this is done, and it is not hard.  There are special GUI apps just for this, that run in WINDOWS.  I like Hash Calc 2.02, because it runs in any version of WINDOWS, is lightweight and simple, and can be downloaded from relatively secure sites like cnet/download.com, so there is less chance of infecting my WINDOWS partition with malware when I run the installation.
If the checksums don’t match, throw that file away and start a new download.  I think it took me three tries one time—over a particularly long weekend—to get a clean download of Linux Mint.  I can’t even remember now, just which version of Mint it was.  I don’t think I ever used it, because I came across a compelling article about “MicroKNOPPIX as your Next Desktop”—or some-such.  But this illustrates my point:  you want to at least have some evidence that the file is not corrupted, before you waste a cd on it.  A corrupted os file will not boot—or if it does, it will be a screw-up when you get to using it.
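[From a Linux session the check is a one-liner; in WINDOWS, a GUI tool like the Hash Calc I mentioned does the same job.  The file-name below is only an example—use whatever .iso you actually downloaded—and compare the result against the MD5SUMS text-file posted on the distro’s download page.]
Code:
md5sum ubuntu-10.04.4-desktop-i386.iso
# or, if you downloaded the distro's MD5SUMS file into the same folder, let it do the comparing:
md5sum -c MD5SUMS 2>/dev/null | grep -i ok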
39. Entry 39:  BURN ON SLOWEST SPEED:    When you set-about to burn a Linux distribution to cd, set the burner on the slowest speed it can be set-to, so that you get a deeper, “cleaner” burn.  Some will dispute this, and say it doesn’t matter; but it is a well-known “trick of the trade”.
All of the major Linux distros and nearly all of the minors are nowadays offered as a freely downloadable file, and when that is burned to a cd or DVD this file will become a bootable live Linux disc.  There are several file-types in this category, but I prefer the .iso-file.  It is complete, and its integrity is verifiable with any one of several “checksum utilities” that are downloadable to Windows, and are free.
You probably will not be able to successfully burn your Linux live-cd using Windows’ own burning utility, like you would a music cd or a DVD.  And I don’t know if popular third-party software like Nero will do it.  It is most often said that we need to download and install a small program called an “iso-buster” (or “iso-burner”) to Windows.  There are several free ones online.
You should do some research, though.  Because just like any download to Windows, there is a hazard of malware and spyware.  Get some user reviews of the program you intend to use.  Make sure your anti-virus software is up-to-date, before you download it.  Always create a restore-point first.  Try to download only from reputable sites.  If you right-click  any downloaded file before opening it, the context-menu that pops-up will usually give you the option to scan the file with your anti-virus software.  Malwarebytes lets us do this too.
I like Active@ ISO Burner.  It is simple, and only performs one function—burning iso files.  This makes it easier for those of us who are not really savvy with a computer to burn the right file, instead of getting the wrong one by mistake.  It will also adjust to burn on a slower speed, which is desirable.  The controls are simple and intuitive, unlike some iso-busters I have tried.  Some sources online say that this one contains spyware/malware, but I have been using it from the user account of my Windows 7 for a long time now, and can find no evidence of this.  And I always seem to get a clean burn of Puppy, or Ubuntu, or Linux Mint, microKNOPPIX, Crunch Bang, or whatever distro of Linux I select.
40. Entry 40:  TRY THIS ONE FIRST, IF ONLY TO GET YOUR FEET WET:    In any case, what I recommend is that you try Puppy Linux, and that may be all the Linux you ever need.  And it can be made very secure.  Linux is already very secure, just like it comes.  More about this later.
41. Entry 41:  DON’T GIVE-UP:    If Puppy can’t be made to work-out for you in a little while, and to suit your usage needs, I recommend downloading and burning the Linux mentioned above, probly in the order mentioned, and trying one live-cd after another, until you find one that recognizes your graphics-card and network-card (for sure), and then (I would hope) your sound-card, and wifi, and the rest of your hardware.  The “big three” I have listed recognize about 90% of all hardware between them (if you mathematically average them together, I am saying), so you are already off to a good start.  One of these will almost surely successfully interface with the totality of your computer’s hardware.  If not, you have the options of 1) giving-up, or 2) keep trying.
You can delete the files of the distros that didn’t work from your hdd later, if you want the space back.
42. Entry 42:  IF THE BROWSER WORKS…    If you can open the default browser included in the Linux, and surf the web, you know your networking is supported, and will probably continue to work (at least over Ethernet), even if you install the distro to your hdd (or, if you're like me, to a thumb-key, because I prefer running it from a thumb-key, and nowadays this is easier than ever).  Another worthy option is an external, add-on harddrive, or a micro-drive (mini-external-harddrive).  WiFi can be a bit different.  Just because it ran for a little while from a live-cd without dropping your connection and then you having to hassle with re-establishing it, does not mean that good behavior will continue if you get to using the Linux on a regular basis.  This is just something you will have to test by trial-and-error, over time—say a couple of weeks of sporadic use, and from one or two different hotspots.  What I prefer is just to do a full install to a thumb-key, which treats the thumb as though it were your hdd, instead of a cd.  Thumbs are not that expensive now.  A 4 GB one is often enough to play with, for this purpose.  I have several (SanDisk Cruzers), and I have re-formatted several of them multiple times, with no detectable ill-effects.
43. Entry 43:  USE BACKUP MEDIA:    Before you undertake to install a Linux to your hdd or even a thumb-key, or to partition a drive (even a thumb-drive), or to do any other "radical" operation to your system, backup all data to cds or DVDs, and preferably to at least one other source, like a cloud-service.  Many cloud-services will give you 2 GB free, and for an unlimited period—maybe for life—because it is their marketing.  They want you to get to like the convenience, and then upgrade to a pay-account with more space, which you never have to do if you don't want to.  DropBox.com is good.  If you use ms Hotmail—even if you just open an account and use it as a spam-box—ms gives you your own private cloud, and of a whopping 25 GB for free.  How can they do this?  Hey, if * you * had about a trillion dollars, you could do stuff, too.
44. Entry 44:  DON'T READILY USE "COMPUTER JANITOR", OR EQUIVALENT PROGRAMS.  If you choose Ubuntu (or any other distro, for that matter), don't use the "Computer Janitor" feature, except as a last (or near last) resort.  This program is essentially the equivalent of a WINDOWS registry booster.  And just as with a registry booster in WINDOWS, it often does more harm than good.  In whatever case, LINUX JUST DOES NOT FILL-UP WITH MALWARE OVER TIME, and because of its different file-structure and write-to-disk method, LINUX FILES DO NOT BECOME CORRUPTED WITH TIME—the way files can in Windows/ms.  In all fairness, Microsoft has done a great deal to remedy this in recent years—really even going back all the way to the mid-nineties, and the replacement of vFAT with the NT stuff.  But the BOTTOM-LINE here is still along the lines of LINUX'S FILE-SYSTEM BEING A LOT LESS IN NEED OF A "JANITOR".
If you think there is a problem of this nature (files corruption), I think I’d look into the “fsck” utility.  You can Google that, and get plenty of pertinent info.
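As a rough illustration only (the device name "/dev/sda1" below is just an example—yours may well differ), fsck is normally run from a live-cd session, against a partition that is NOT currently mounted:
sudo fsck /dev/sda1
Running fsck on a mounted partition can do more harm than good, which is why the live-cd route is usually recommended.  Again, research the particulars for your own distro first.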
45. Entry 45:  LAPTOPS AND MOBILE DEVICES ARE A BIT OF A DIFFERENT CONCERN, WHERE IT COMES TO RUNNING LINUX ON THEM:    But most of us don't let this stop us.  When it comes to laptops, netbooks, and mobile computing devices ("moby"), know that 1) you had better check the hardware requirements for any Linux distro before you try it, as "moby" devices have less computing horsepower, generally, than do fixed, non-portable machines like a "tower computer":  there is plenty of this information available on the world wide web, and it is pretty reliable, because people in-the-know check one-another; 2) mobile computers have batteries that they can run off of, and so nearly all have some power-management/battery-management system installed, which is another layer of complexity, and the source-codes of these programs are not only often kept secret by their creators (like firmware vendors to the big computer manufacturers), but there are so many variations in make and model that the Linux communities have a very difficult time test-running all the Linux distros on all of them.
This can create a situation where a Linux distro will boot and run fine the first time (especially if you're tryin' it from a USB-thumb instead of a cd), but then it won't boot and/or run right the second time you try.  SO YOU SHOULD TRY MULTIPLE LIVE-SESSIONS, TO SEE IF IT MIGHT BE THE ONE FOR YOU AND YOUR COMPUTER.  Even if you don't do all the research first, a live cd or USB-thumb is not at all likely to do serious harm to your machine, unless you are just careless.  Most mobile devices (at the time of this writing, late 2011) use either the older APM or the newer ACPI standard to manage battery life, usually the latter.  A fair amount of info about these is available on the web.  But the deeper I get into Linux, the more I come to think that the way most Linux users deal with this is just to use boot-parameters.  Most problems with a laptop/mobile install of Linux seem to be ACPI/power-management related.  This is most often dealt with by means of boot-parameters ("cheat-codes"), or so it seems.
Note that Ubuntu’s Startup Manager is the easy (graphical, point-and-click) way to set/re-set boot parameters.  I think some other distros have a similar graphical tool.
KNOPPIX is famous for its boot-parameters ("cheat-codes"), but really, most any Linux can be booted with parameters that work with the particular distro.  Most params will work across a large range of distros.  Again, research this on the web.  Google is your friend.  Know that the param "acpi=off" is rather like using a sledgehammer to kill a fly; it essentially lets Linux run on a mobile device as though it were a tower-computer, for that session.  (Or for every Linux session, if you edit the Grub from within the os after it is booted.)  Note that kernel boot parameters are case-sensitive, and this one is normally written all in lower-case:  "acpi=off".  It can be good for testing 2 or 3 boots of some distro you're interested-in, to see how well it can perform on your particular machine.  But to use this on a regular basis—especially where you are often in want of a place to plug-in to an electrical outlet—it is going to result in NO automatic screen-dimming when you haven't moved the mouse in awhile, and sleep/hibernate almost certainly won't work, and there will be scant warning (if any) when your battery reaches the end of its charge.  Nor will you probably be able to get any accurate readings from an on-screen battery monitor.  But there are solutions to this.  Continue reading.
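For the record, here is a rough sketch of how one commonly makes a boot-parameter like acpi=off permanent on an Ubuntu-family distro that uses GRUB 2—a generic illustration, not gospel for every release, and back the file up first.  Open the GRUB configuration file in a text editor:
sudo nano /etc/default/grub
Find the line that reads GRUB_CMDLINE_LINUX_DEFAULT="quiet splash" and change it to:
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash acpi=off"
Save the file, then tell GRUB to re-generate its menu:
sudo update-grub
For a one-time test, it is usually easier to just press "e" at the GRUB menu, add the parameter to the end of the line that starts with "linux", and boot with ctrl + x (or F10).  Nothing done that way is permanent.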
I have noticed that most people seem to find a place to plug-in their laptop or other mobile device.  Usually.  Even at airports.  But really, once you get to really experimenting seriously with a distro on some mobile dev, and you are thinking of making it your distro of choice, it would perhaps pay you to try running it with some other, more nuanced boot-parameters, like perhaps "nolapic"—if that one is recommended for your distro.  You should experiment with different params, which affect the ACPI or APM—whichever of these two your manufacturer installed.
Another thing that I should mention is that desktop Linux is less capable, generally, when it comes to stretching-out the supply of electricity in your battery—even if it's not having a problem with ACPI.  This has been true for a long time.  I NEED TO INTERJECT HERE, HOWEVER, THAT A VERY GREAT DEAL OF PROGRESS HAS BEEN MADE ON THIS ISSUE IN THE LATE 2000s, in Ubuntu and, more to the point, distros based on Ubuntu.  As well as several RPM/Red Hat-type Linux.  I guess another potential remedy might just be to take some of the money you saved by not having to pay for WINDOWS or MS Office and buy a better battery.  My netbook, for instance, came with a 3-cell battery, like most of 'em.  But I am trying to save-up for a 6-cell.  And frankly I'd still want to upgrade to a 6-cell, even if I were sticking with WINDOWS as my only OS.  It'll be more convenient, and these modern 6-cellers go from almost empty to full charge in not much more time than the 3-cell ones, once you plug-in.
46. Entry 46:  UPDATE YOUR SYSTEM BEFORE YOU INSTALL A SOFTWARE:    Be sure to open and run the Updates Manager in your Linux, fully installing any updates that may be waiting, before you install any new software packages (apps).  This is important, though it may not hurt if you forget.  Note that Lighthouse Pup has its own updates-manager.  Other Puppies, as far as I can discern, do not have an updater, and you probably should not worry about this, because Puppy is rather different in the way it works than other Linux—though curiously this is not an impediment to us newbies—probably because of the simple way the developer laid it out.
Most runs of an updates manager in Linux do not require you to restart.  This has pretty much always been the case.  Because Linux writes-to-disk differently, most times when you run an update manager, it does not require you to re-boot, though sometimes it will.
47. Entry 47:  FLASH VIDEO AND MULTIMEDIA:    Most Linux do not come with multimedia codecs installed, for IP reasons which are a little too complex to get into here.  You will probably have to install Flash-player, for example, in order to be able to watch most video without Linux crashing, or merely going into "panic-mode".  Unless you are in Puppy-Wary Linux, or Linux Mint, or the Mariner version of Lighthouse-Puppy.  You can just check this first, by going to YouTube, and trying to watch pretty much any video.  If it works, then Flash is probably already installed.  I wouldn't try this in Ubuntu, however, because no Ubuntu comes with this installed.  And it will probly cause the CPU, the fan, and then the os to start to "flip-out".
The Big Three make it pretty easy to install multimedia functionalities.  Run Update-Manager, and update the system first.  If you are using Ubuntu, you open Software Center, and type "restricted extras" into the search-field.  There is also the link " Click here to install the ubuntu-restricted-extras package" located at: https://help.ubuntu.com/community/RestrictedFormats .  Close any other open programs.  Install the package.  Now close Software Center.  Shutdown cold, count to twenty, and boot back-up, just for good measure.  Note that Ubuntu's Restricted Extras package will equip the system with other needed software, beyond just the Flash-player code.  The Linux equivalent of the JRE nowadays seems to be OpenJDK.
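If you would rather skip the Software Center and use the Terminal, the rough equivalent on Ubuntu (assuming your repositories are reachable) is:
sudo apt-get update
sudo apt-get install ubuntu-restricted-extras
The first command refreshes the package lists; the second pulls in Flash, the common audio/video codecs, and the Microsoft core fonts in one go.  You will be asked for your password, and part-way through, to accept a font licence.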
If you are in KNOPPIX or some Puppy variant ("Puplet"), one way is often to just go to YouTube in the default browser, and start watching a video.  It is likely you will then be presented with an opportunity (one-time) to install Flash Player.  Take it.  The dialog that appears will often let you just follow the prompts to install it.  If these opportunities or methods fail, there are other means to add Flash functionality and other needed proprietary codecs, which you can download for free.  But they may be more cumbersome than the means I just elucidated.  Be mindful of entry 45.  Try to get the stuff from a repository first, and then some other reputable source, such as the maker's webpage.  PPAs should probly be resorted to last.
48. Entry 48:  I’M NOT SURE WHAT AN “OGG” IS, BUT I DON’T LIKE THE SOUND OF IT.      If I were you, I don’t think I’d make a habit of saving any files to the .ogg format, because it seems to have just been one of those ideas that, well, wasn’t a * bad * idea, when it was on the drawing-board, so to speak; but it does not seem to have turned out exactly well.  I feel we do better to save music to mp3, mp4, &tc.
49. Entry 49:  “RIVER” CAN BE BETTER THAN “POD”:    If you’re looking to get away from WINDOWS, getting an iPod is probably not your best bet. While there are many Linux programs out there that interface well with the iPod (AmaroK, GtkPod, etc.), iPods aren’t ideal for Linux, and you’re probably better off getting an iRiver or a Sandisk player. They tend to work well with Linux (without helper applications) and support drag-n-drop. iRivers, too, supposedly support the Ogg format (not just MP3).
50. Entry 50:  USE A POST-INSTALL CHECKLIST:    Research and then use a post-install checklist, for your distro of choice, when you finally decide to make one of them your Linux os.  Many are available on the web.  Research them—read the comments at the bottom of the web-page, and make up your own mind.  There is a very good series of these checklists at the site "How-to Forge".  To arrive there, usually you can just Google the phrase:  "<name of your distro> perfect desktop".  Or alternately "how to make <name of your distro> the perfect desktop".  This will usually cause it to be the first or second result displayed.  The author provides very detailed instructions (with images) as to how one should undertake to complete one's install.  Is it more work than setting-up WINDOWS?  Yes, it is:  but take into account that most people don't bother to set-up WINDOWS properly, and many probly don't know how.  Because the WINDOWS metaverse is in * some ways * actually inferior when it comes to documentation.  It's supposed to be the other way around—according to what "they" say:  desktop Linux is always supposed to lag behind WINDOWS in documentation.  And it * did *, for a long time.  But thanks to a big effort on the part of the Canonical Ltd. and Fedora people—both the distros and the Community—desktop Linux documentation now often * exceeds * that of WINDOWS in some areas.
And http://www.howtoforge.com/the-perfect-desktop-ubuntu-10.04-lucid-lynx is a good post-install checklist (or Google:  “ubuntu perfect desktop”; or “ubuntu perfect desktop how to forge”).  The HTF site (How To Forge) has a set of easy-to-follow, graphically-enhanced instructions on how to properly “complete” almost all of the major distros, and even separate instructions-sets for different releases/builds of your desktop Linux.  These instructions will take one of us noobs perhaps a morning to execute:  but it is well worth it.  These simple pages allow you to install needed codecs and similar coding which—though free to use—are patented/privately owned, and therefore cannot be bundled-in with your download of many desktop Linux.  The author tries to include everything you’d need to do to Ubuntu, Fedora, or what-have-you, to make it a full and usable replacement for a WINDOWS desktop install.  I myself have followed most of his recommendations, and I can only report positive results, where my own equipment is concerned.
51. Entry 51:  WHAT THE HECK IS THIS “ Command-Line/Console/TERMINAL” thing, AND HOW DO I GET TO IT IF I NEED TO?
In Ubuntu, you should be able to get to it from the Applications > Accessories menu, which is found at the upper-left of the Desktop, if you are using the GNOME Desktop Environment, which is the usual (default) one in Ubuntu.  Another way is the keyboard command ctrl + ALT + T.  Another way is to press alt + F2, which brings up a little "Run Application" box:  type "gnome-terminal" into it (without the quotes), and hit Enter.
In KNOPPIX (6-series) or Puppy Linux, there is usually an icon in the system-tray, or on the desktop.  Or it can be found by clicking on the “start” button, and moving the mouse-pointer through the menus.
An IMPORTANT CAVEAT I will insert here, is that you should understand and know the bare basics of command-line/Terminal commands BEFORE YOU USE ANY, and cross-check commands using different sources (webpages, books) to verify that somebody isn't playing a nasty joke on you, by suggesting a malicious command—though I have never seen this.
Here is a link to a nice resource for us newcomers to use to begin familiarizing ourselves with the Terminal.  You needn’t read this whole set of webpages (which really amount to a free online e-book), but rather enough to help familiarize yourself.
http://linuxcommand.org/learning_the_shell.php
It may be tempting to blindly type "commands" you found on some web site, expecting that they will do the described task.  However, this sometimes fails just because you have a newer version, slightly different hardware, or another distribution.  You could try to execute each "command" with the --help option first, and understand what it is supposed to do.  Then it is usually very easy to fix various small problems (/dev/sda -> /dev/sdb and so on), achieving the desired goal.
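For example, here is a harmless one to practice on:
rm --help
This only prints the options that the "rm" (remove) command accepts—it does not delete anything.  Most command-line programs respond to --help in the same way.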
Do not run rm -rf / or sudo rm -rf / unless you are seriously considering deleting all of your data.  Run the command ‘man rm’ for more info.  The “man” command (without the quotes) will give you the user-manual information for pretty much any program on the Linux system—provided the makers of the app or the distro included it in the files.  Most often it will be in there, and “man” will display it.  For example,
man vlc
and then hit Enter.
This may display the contents of the VLC media-player user-manual on your Terminal—assuming VLC is installed to your system in the first place.
Note that Linux man pages are of a more technical nature, generally, than most of us ordinary WINDOWS operators are used-to.
In the man-page viewer (the "pager"—usually a little program called "less"), the down-arrow key scrolls down one line at a time, the spacebar pages down a whole screenful, and pressing "q" quits you back to the prompt.  I've never had to mess with the Terminal much beyond that.  That's how far desktop Linux has come!  You can do 99% of configuration and the rest from your graphical desktop (GUI)—in other words, POINT-AND-CLICK!
In most distros, you can highlight stuff in the Terminal by dragging the mouse—either from right to left, or vice-versa—just like normal.
Other functions (like copy and paste) don’t work exactly the same way.  See the entry pertaining to “PASTING INTO AND OUT OF TERMINAL”.
Similarly, don’t create a file named ‘-rf’. If you run a command to delete all files in that directory it will parse the ‘-rf’ file as a command line option and delete all files in the subdirectories as well.
If you want or need to save some “man” instructions from a display (called a “print”) in Linux’s Terminal/command-line, one way is:
# PAGER=cat
# man less | col -b > less.txt

Another I guess would be (to save it as .pdf):
man -t awk | ps2pdf - awk.pdf
Notice that the long vertical bar after "less" and after "man -t awk" is the symbol from your (full-size) computer keyboard known as the "pipe".  It is found on the same key as the back-slash we use in Windows.  (So you hold down SHIFT and punch the " \ " key.)
Other means are available online.
52. Entry 52:  STICK TO YOUR DISTRO’S REPOSITORIES, IF YOU WANT TO ADD A SOFTWARE: If you decide you need or want to add some software, try to install it from your repos first, from the default parts that are already enabled/”turned-on-and-activated-for-downloading”.  Run Update-Manager (unless you’re using a distro that doesn’t use an Updater, such as Puppy), and update the system first.  This is a feature WINDOWS does not enjoy—you can automatically update all the softwares on a Linux system with a few mouse-clicks.  It is recommended procedure in Linux that you update everything before you install an app/package.
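For what it's worth, the Terminal equivalent of running the Update Manager on a Debian/Ubuntu-family distro is roughly this (a sketch, assuming the standard apt tools are present):
sudo apt-get update
sudo apt-get upgrade
The first line refreshes the list of available packages; the second actually downloads and installs whatever updates are waiting.  Puppy, as noted, does things its own way.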
Note that the GNOME people run a site which in effect is their own repository, for GNOME-oriented and the Gtk+ type of softwares often associated with the GNOME DE.  I think they even feature some non-free apps, if you're at your wits-end and are ready to download some shareware.  If you are ready to do that, you'd better get up and make yourself a sandwich, and then come back and research it some, so that you are sure what you're getting before you pay money for * anything * to do with a Linux desktop, IMHO.  Because certain things may actually be a good enough value to justify the outlay.  But other shareware/pay-for apps may not be.  At the time of this writing, I tend to think the ones that fall into the first category (i.e. worth the price) are only a very certain few, as there is usually an open-source equivalent that you * can * get to work—if you persevere.  Depends on what it is.  My advice is that you use Linux for awhile, and learn more about it, before you're willing to think about buying a program.  And I think I'd try installing from my default repos first—at least until I got my "sea legs".
Don’t install software the WINDOWS way, and download some app with your web-browser, and then try to install it; this way of doing should NOT be your first thought, but rather more toward the end of your list of options.  Use your Package Manager instead, and go with its default configuration first.  Even if all graphical/point-n-click methods of obtaining a software package fail, you * CAN * download a package from the Terminal/command-line, with no browser at all.  And this is probably better.  But you will probably be able to install what an ordinary user would need (and then some) by point-and-click, via your Package Manager, or Ubuntu’s Software Center (if, of course, you are using Ubuntu, or for that matter certain Puppy Linux versions; Ubuntu’s software Center is also available in Pinguy Linux, and I think also in AriOS, and probly a few other distros).
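As an illustration of getting a package with no browser at all—VLC media player is used here only as an example of an app that lives in the standard Ubuntu repositories:
apt-cache search vlc
sudo apt-get install vlc
The first line searches the repositories for anything matching "vlc"; the second installs it (along with any dependencies it needs), after asking for your password.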
If you need help, try Googling for a few recommendations.  There are probably at least a few other people out there, who wanted the same package, and who posted how they successfully got it.  So more than one method of getting it will be shown.  Use your sense, and make your own decision.  If you don't get a good result, it can be undone without too much hassle, and you can try another way.  Trying Software Center, if in Ubuntu, and then Synaptic Package Manager (which is also available in Ubuntu), will usually be all you need.  Synaptic can usually be installed to most Debian-based Linux, if it is not already in the system somewhere.  It's a pretty decent tool, and usually not that difficult to master, if you have much in the way of technology skills at all.
Some programs can also be “ported” to a Linux system from a USB key, or cd or DVD, even though they were downloaded with WINDOWS—though this would not be among our first options.   I only mention this to illustrate how versatile Linux can be.  I counsel that we “noobs” should be conservative, where it comes to installing anything—even on a Debian-based Linux.  Debian-based Linux distros are:  Ubuntu and its variants; then:  Linux Mint, KNOPPIX, PeppermintOS, Crunch Bang, SuperOS, Kanotix, MEPIS, Bodhi Linux, Easy Peasy, JoliOS (sort-of; this one more for tablets, netbooks, & smaller), and PinguyOS.  There are more:  you could look at the Wikipedia article “List of Linux Distributions”, which lists the Debian-based ones first—though this may be just because Debian comes before the other Linux “base-distros” in the alphabet.  Puppy Linux is not based on Debian, but the way it’s built * nowadays *, it might as well be.  Trying distros that aren’t Debian-based is fine, but I am a Debian Chauvinist when it comes to us beginners, and I don’t apologize for that.  I just feel having the Debian structure underneath makes certain things easier for a newbie.
Know what the app is, what it's supposed to do, and where it came from.  A little googling to confirm (or to bust) what your friend Barb told you about such-and-such free Linux software download will potentially save you a headache later.  In any case, stuff generally has a harder time affecting your system, just because you downloaded it in Linux:  more often than in WINDOWS, it would have to be actually installed to a directory to have any effect at all.  In WINDOWS, by contrast, lately you can get a virus just by visiting a webpage, whether you download anything or not.
Stuff is pretty easy to un-install, as long as you have not forgotten/lost your Primary password.  That's usually the one you log-in with anyhow, so you probly won't forget it.  You should not continue to use the same login/Primary/root-account passwords indefinitely, though.  If you are really gonna do things "by-the-book", you should change these every now and then.  But a lot of people don't.  There ain't no ready-made way to set Restore-Points, though.  Not of which I am aware.  Go back and look at the sub-entry on "DRIVER ROLLBACK", under the entry "MY BIG THREE", if you need.  Free programs like Flyback or MondoRescue GPL/free Edition are about as close as I can find to WINDOWS System Restore, and utilities such as these can stand-in for WINDOWS Backup Center, too.  Also Deja Dup (pronounced "Day-zhah-Doop") is a good one to check-out.
You do not need to hassle with de-fragging, creating a Restore-Point before installing a new software, or hassle with re-starting afterward.  This is because Linux writes-to-disk differently than WINDOWS, and Linux stuff is not infested with malware and spyware, generally speaking.  And this latter is probably largely owed to the fact that the Linux world operates on a different socio-economic/developer-pay model, and is much less "fee-for-product" than the Windows/DOS/IBM world.  Further, as I may have already mentioned, a sizeable chunk of the Linux metaverse—Ubuntu and Canonical in particular—gets funding from the South African entrepreneur/philanthropist Mark Shuttleworth.  And Linux developers are more in the game for their own repute and stature within a world-wide * Community *, and so are treated like rock-stars and demi-gods when they pull-off something extraordinary.  As opposed to just being admired for having made (way too much ?) money.
Try installing a software from your package-manager first (or better yet from Software Center, if in Ubuntu), and from its default settings.  If you're in Ubuntu, know that installing a software from non-default parts of the repository-structure (like "backports", "proposed", "universe", "multiverse", "PPA", &tc.) may mess-up your dependency/library arrangement, and cause your Ubuntu to become buggier or unstable.  Or I guess it could even install something worse.  So if you just * have * to install something from one of these "extra" repos, you'd better be careful, and research it first.  But I can hardly see one of us ordinary users needing to add something that is not included in the 15,000 + apps available from the default/distro-maker's sanctioned repos.
If you need some version of a program that is not in the default configuration of the repository of your distro, you probly ought to set your mind to researching it on the web, and find a way of getting it (even from non-default PM settings) that users have given a "thumbs-up" to, and which has been approved by your distro's makers.  Get some user reviews.  Compiling from source is a way many of us get newer versions of some app.  And it's not as hard as it sounds:  much of it is "automated".
Yes, Linux is touted as an “open” system.  But that does not mean that us noobs should make our Linux installs unstable.  It might be wise to get some experience with the Linux system, before taking-on riskier operations.
Try not to uninstall any of the default apps that your distro came with.  The dependencies (sort of like .dll files in WINDOWS—sort of, but not exactly) may be shared with other programs on your system, which you need to make it work right.  So I want to emphasize that, if you want to customize your Linux, the recommended stratagem is to start with some Out-Of-The-Box configuration (distro) that you download and then verify (checksum), and then "BUILD UPWARD", adding stuff and preferences, running the updates-manager before each change.  [But you don't have to re-boot!  That's another nice thing about Linux!  You usually aren't required to re-boot after installing a new software!  Just try to remember to run the Updates Manager (if your distro uses one) before adding each app.  If you maintain the system at all, it won't take the Updater long to run.  Often, there will be no updates waiting!]  They say you should not uninstall [Remove] much of anything that was present when you first booted the distro, as its dependencies may be needed.
NOTE that the Ubuntu Software Center in later versions (after 10.04 LTS) is much improved, and you can get user-reviews right from the program, easily.  I still prefer to research it a little, online, if it's not some big, well-known app (VLC media player, FireFox, Google Earth) I'm about to install.  But this is a nice new feature, and it adds to the usefulness of Ubuntu's Software Center.  The Software Center has been improved in several other ways, also.  It is now even more "graphical", and operates more like the apps you see on all those smart-phones, which are very popular right now.  As with several of the customized graphical utilities being built by the Ubuntu people of late, this improved version of the Software Center is actively being ported to other Linux distros, with Ubuntu/Canonical's blessing.  (Ubuntu is open-source, remember?)
It is possible to run many softwares built for a  WINDOWS-environ in Linux, using a “compatibility-layer”, like the Wine program.  But I really do not find that it works that well.  To my mind, the Wine utility is rather a last-resort.  It is better to just search for a native, genuine Linux program.  The search-engine in Ubuntu Software Center is a good place to begin.  There is also a file in my data-base here, pertaining to this.  Your package-manager will allow you to install a software package (app) from the repositories associated with your distro and its proctors.  So you know it was probably coded properly, and your package-manager may be able to automatically resolve any dependencies (libraries) issues for you, without you having to do any work.  Or at least it can often warn you that you will need additional input to your library files, so that the new app has a critical piece of needed routine.
If you can't install something with your package-manager, then you can try other methods.  Some of the MAJOR apps have their own installer, coded for Linux.  That's how I got Dropbox for my Ubuntu (10.04) install.  Files with the extension .deb are arguably the easiest to install, and play roughly the same role as installer files (setup .exe or .msi) in WINDOWS.  These are intended for Debian-based Linux systems, such as Ubuntu, Crunch Bang, Linux Mint, or KNOPPIX.  Puppy Linux is not Debian-based, but can install .deb files.  If you want something and decide to download it as a .deb file, do so through your sanctioned repos or else be careful what source you get it from, because these types of Linux files essentially install themselves.  If one of these were to contain some malware (rare in Linux), it'd be in your system before you knew it.
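If you do end up with a .deb file sitting on your hard drive, a bare-bones sketch of installing it from the Terminal looks like this (the filename is hypothetical; on Ubuntu, double-clicking the file to open it with gdebi or the Software Center does the same job graphically):
sudo dpkg -i some-package.deb
sudo apt-get install -f
The second command tells apt to fetch any dependencies the package needed but did not bring along—dpkg by itself does not resolve dependencies.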
Otherwise, you may have to do it from a console (Terminal, command-line).  This is not as nasty as it sounds.  If you’re not careful, you may even get to like it.  At some point, you may have to learn how to de-compress a .tar file, and install it to the right directory.  Because there ain’t no installer-shield “wizards” in UNIX, dude:  not once you move beyond Ubuntu’s Software Center/your Package Manager.  Go much beyond these, and * You * are the installer-wizard.  But it’s actually not so hard.  [Really, Software Center, Synaptic Package Manager, gDebi Installer, Archive Manager, and others (often included in the os by default—otherwise, these can be downloaded, free of charge) are easier than many of the installer wizards in the “Windows metaverse”—or so is my experience; all I just named are GUI—”point-and-click”.  And, come to think of it, there are quite a few wizards in Puppy Linux—though most of these seem to be for configuration of various things.]  And just think of the fact that you aren’t gonna get any of those pesky unwanted tool-bars with a Linux software.  And who knows what other stuff (spyware? malware?) those wizards people use in Windows can embed in a WINDOWS system.  Even installing a Linux software by means of a .tar file is a straightforward three-step process.  And it is not that hard to learn.
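For the curious, the "three-step process" for a classic source tarball usually boils-down to something like this sketch—the filename and version are made-up, and you should always read the README or INSTALL file inside the archive first, since projects vary:
tar -xzvf someprogram-1.0.tar.gz
cd someprogram-1.0
./configure
make
sudo make install
The first line unpacks the archive, the second moves into the unpacked folder, and the last three configure, compile, and install the program.  If ./configure complains about missing libraries, that is your cue to install the "-dev" packages it names, via your package manager.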
There are several file-types you can install to a Debian-based Linux system.  Some of these will not install to a non-Debian Linux—or at least that might have been the case in the past.  Apps that come as the following file-types can be installed to a Linux distro:  .deb, .rpm, .bin, .tar.gz, INSTALL, .rar, and .sh.
If it was me, I think I’d go with .deb first, if I could find my desired app in that format, assuming of course that I already tried to install it the point-and-click way, with Software Center, or my Package Manager.
My next choice would be .tar, which Ubuntu has graphical utilities to handle, though you’d better read the directions first.
Finally, there is .bin, which just stands for "binary"—these are pre-compiled, self-contained installers.  (Building a program yourself from a source archive, like a .tar.gz, is what they usually mean by "compiling from source".)  More about this in other documents.
I don't think I'd mess with the remaining formats—INSTALL, .rar, .sh, or the compressed .tar.bz/.tar.bz2 variants—not as a new user, anyway.
While we’re on the subject, I do not much care for the various scripting, “butler-like”, “helper” utilities that can be added—not even Ubuntu Tweak.  Ultamatix, for example, has some significantly negative feedback on the web.  This  may be no more than a bunch of hooey:  but why should us noobs leave ourselves open to finding out the hard way?  To my mind, it is about as easy to just do your own “tweaks”, and much safer for a noob than if some script you didn’t write is doin’ it on your behalf.  And you’ll learn necessary things for you to know in order to continue using Linux, by just doing the necessary post-install adjustments yourself.

53. Entry 53:  KNOW HOW TO SHUTDOWN LINUX:
Ideally, Linux should only be shutdown in the prescribed way, by using mouse or Terminal commands.  A "hard-reset"—shutting-down by holding your finger on the power-switch—is not desirable:  it can corrupt the file-system, and in a worst case force a re-install, though I have not seen this happen in my (limited) experience.
Normally, we shutdown desktop Linux with the mouse, by clicking the shutdown option where it is presented in the menus.  In Ubuntu 10.10 and earlier, I think it is shown when you click one of the links on the far upper-right of the desktop.  Try clicking on your username, up there.
In Puppy and KNOPPIX 6-x, there is more of a traditional, “XP-style” Start-button.
If you're on a lightweight desktop environ (for instance Crunch Bang uses Openbox—or else Fluxbox, or maybe certain set-ups of LXDE), you may have to RIGHT-click anywhere on the desktop in neutral space, to get the start menu.  This is very easy, though, which you'll see the first time you do it.
Various commands can enable you to shutdown Linux from the terminal.  In Ubuntu, usually type-in "sudo halt" and then hit Enter.  You may be prompted for your (user) password.  In Puppy, try "shutdown", or perhaps "shutdown -h" and then hit Enter.
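A couple of shutdown/re-boot commands that work on most modern distros (a sketch—your own distro's documentation is the final word):
sudo shutdown -h now
sudo reboot
The "-h now" means "halt, immediately"; you can substitute a time, such as "shutdown -h +5", to give yourself five minutes to change your mind.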
I don't think most Linux distros use alt + ctrl + delete (the WINDOWS way to get to Task Manager to re-boot).  It's worth a try, though, as some of 'em do have this feature.  [This feature exists in Ubuntu, but must be enabled after you install it/begin using it, so that it is "turned on".  Search in this database, or online.]  AT LEAST AS IMPORTANT to a Linux user, there is ctrl + alt + F1 (in Puppy Linux I think it's ctrl + alt + Backspace, which kills the graphical layer and drops you to the prompt), which will often take you to a virtual console—a plain, non-graphical login screen outside the desktop—and you can re-vamp things from there, with command-line commands—provided you know some.  (Or you can just shutdown from there.  I find "sudo halt" works from this environ in Ubuntu 10.04.  Some distros may still require "shutdown -h now".  There are also other commands that do the same thing, or related.)  I will try to compile a list of the most important commands, as soon as time permits.  Until then, there is Google.  Or my document "linux common admin commands".  Or http://www.omgubuntu.co.uk/2011/07/top-terminal-commands-newbie/ .  You should probly read the comments too.
NOTE that there is also the “magic sys-req keys”, a.k.a. R.E.I.S.U.B., which may not work in some distros unless the setting is changed by you, after the install.  Sometimes it fails to work anyway, and your only option is a “hard reset”, which means holding-down the power-switch on the computer’s  case with your finger until the dern thing finally turns-off.
Google “enabling magic sys-req keys in Linux”.  This is a feature that (supposedly) comes already enabled in most releases of Ubuntu.  I could never get it to work for me, in 9.10 or 10.04.  But I found that it did work in my install of Linux Mint Fluxbox 10.  View the document “l ubuntu emer commands n shutdown” in my database.  This is another method to safely shutdown Linux and re-boot.
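On many distros you can check, and temporarily enable, the magic sys-req keys from a Terminal.  This is a generic sketch; the exact file and the default value can vary from distro to distro:
cat /proc/sys/kernel/sysrq
echo 1 | sudo tee /proc/sys/kernel/sysrq
A value of 1 means fully enabled.  To make the change survive a re-boot, the usual method is to add the line "kernel.sysrq = 1" to the file /etc/sysctl.conf.  Research this for your particular distro before relying on it.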
It is also true that you can (and probably should) enable the feature which lets you use just a quick tap of your power-switch to signal the Linux kernel to shutdown the machine as best it can.  This is desirable from the standpoint of avoiding improper shutdown due to a frozen GUI.
In Ubuntu and most of its variants, this seems to be able to be set from the menus.  I believe you go to System > Preferences > Power Management > General tab, and switch the option "when the power button is pressed" to "Shutdown".
THIS IS ANOTHER SETTINGS-CHANGE TO ADD TO YOUR POST-INSTALL CHECKLIST—especially where it comes to running your Linux on a laptop or mobile device.  This setting may already be enabled, however, in certain other distros of Linux, other than Ubuntu.  Consult your distro's manual, or other instructions I have posted here, or that you can find.
Be smart, and print at least some of this out before you start using Linux.  Or find a way to have access to at least some of these instructions from outside of your computer beforehand, so that you will be  able to consult them if you temporarily get into trouble because of unfamiliarity, or some hardware compatibility issues that need sorting.
For more in-depth, play-by-play information, you might look at the file “l j s shutting down karmic koala”.
54. Entry 54:  Most of your major keyboard commands that you are familiar with from WINDOWS work just the same way in most Linux.  Ctrl + F, ctrl + c, ctrl + v, ctrl + x, ctrl + z, &tc.  See the entry pertaining to PASTING INTO AND OUT OF A TERMINAL.
55. Entry 55:  SUPPORT:  THERE’S LIVE HELP:    Be aware that Ubuntu has its own free live chat-room help-channel on the web (IRC), if you have a problem to which you can’t find the solution by Googling.  You should try to answer it yourself first, though, by searching the web.  It is open 24/7, and staffed by volunteers.  Try #Ubuntu on IRC.
I can currently get-on at this link:
http://irc.netsplit.de/channels/details.php?room=%23ubuntu&net=freenode
Just choose a username for the duration of your chat-session, and click “continue”.  If there are a lot of ppl on there (which is usual), then you’ll have to be as patient as you would be if you were going onto, say, Computer Hope live, in order to get some free advice about Windows.
Keep typing your question every few minutes, re-phrased in a different way (so the proctor won't think you're double-posting); if the room is just too busy, just hang-up and try again later.  Remember—these people are all volunteers, and they are not being paid to be on there.  They're doing you a favor, just by being possibly available to help you with Linux, which you probly got for free.  [This is no different than any one of a few IM sites that help people with ms Windows:  free, staffed by volunteers, sometimes crowded.]  Don't abuse this nice free resource.  And try to (plainly but politely) make it clear in your question, that you've tried and tried to help yourself first, and answer the question by Googling—but you have not been able to get an answer you can understand/use.
For further information:  https://help.ubuntu.com/community/InternetRelayChat
56. Entry 56:  SUPPORT:  You can obtain help for almost any particular program or distro in the irc server irc.freenode.net (example: #debian, #ubuntu, #python, #FireFox, etc). You can find user communities also in irc.freenode.net.  A link I currently use for Ubuntu’s live-chat help-channel is:
http://irc.netsplit.de/channels/details.php?room=%23ubuntu&net=freenode
57. Entry 57:  SUPPORT:  EVERY MAJOR LINUX DISTRO HAS A USER-MANUAL.  These can be found online.  I have posted some manuals which were compiled by me, initially for my own use.  There is also a “Help Program” available to you the user, usually by pressing F1.  Google is also a way to find-out answers to questions, and solutions to problems.
58. Entry 58:  SUPPORT:  LINUX MAN-PAGES:  These are manual-pages (hence "MAN") attached to pretty much each program/package that is installed to any Linux system, and these were usually written by the same people who created the particular app/program.  You read them from the console/Terminal (a.k.a. "command-line"), by typing "man" (without the quotation marks), and then the name of the app, and then hit Enter.  Tapping your up or down arrow key will often allow you to scroll.  MAN-pages may also be available on the web.
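Two quick examples you can try without any risk:
man ls
apropos partition
The first displays the manual page for the "ls" (list files) command; the second searches the one-line descriptions of all the man-pages on your system for the word "partition"—handy when you do not know the name of the command you are looking for.  Press "q" to get back to the prompt.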
A link to SuperMan Pages online is :   http://linuxcommand.org/superman_pages.php
I must add that much of what is offered here is * old * ; much can still be learned from it, though.
59. Entry 59:  SUPPORT:  FORUMS ARE VERY USEFUL:    The major distros all have their own dedicated forums-sites, where you can post a problem or question, and check back later for replies.  THIS IS A VERY IMPORTANT FEATURE FOR US NOOBS.  Be civil.  Be polite.  Do not be glib or snarky.  Try to work it into your post, that you have tried repeatedly to solve the issue on your own first, and just can't find the answer.  A great many users with more experience than either you or I watch these forums, and will try to help a person who seems genuine.  There are also independent (and sometimes ad-supported) forums that deal exclusively with Linux difficulties.  Google it.
Ubuntu forums have many gradations, according to one’s level of involvement in the distro.  There is a section for absolute beginners, and a section toward the other end of the spectrum for advanced users.
The people who try to help others on Ubuntu forums are usually pretty knowledgeable, and, like Ubuntu itself, they try to help anybody who needs it—they “take all comers”.  Even if you get mad and quit, if in time you decide to return, it is often as though you never left.  Ubuntu is sometimes criticized as “suffering from its strengths as well as its weak-points”:  it tries to be a universal desktop, and “to be all things to all people”.  To some extent these criticisms are true.  However, I’d like to point-out something else, where it comes to Linux forums in-general:  if some of us get a bit annoyed sometimes, with those of you who are in want of help, it is as often as not for reasons other than you’d think.  Often as not, it’s just out of frustration, and not at you as the real object.  Rather, we’re frustrated at the fact that there is so much mis-information out there, where it comes to Linux, and the seeming fact that people trying it often seem to just be obsessed with its deficiencies, which frankly are relatively few.  For those of us who use it daily (and who feel we probly learned to use it just through downloading and messing-with it, and “sheer determination”), well, we sometimes take certain questions as unfair criticism of a distro we have come to love.  And our memories of how much difficulty we may have had learning it have been pushed-down deep in our minds—by work, by spouses, family, Kim Kardashian, and perhaps the looming prospect of foreign war—in short the innumerable distractions of postmodern life.  And then there’s A.D.D.
So what often seems like occasional testiness from the Linux community is often no more than the misinterpretation of an innocent question on our part.  It's not usually our fault, any more than it is yours; it's just a reflection of human limitations in the face of technology, and using machines that can calculate the hypotenuse and angle to an asteroid part-way between the centre of the earth and that of the moon, and in under twenty seconds.
60. Entry 60:  SUPPORT:  UBUNTU SUPPORT RESOURCES
[ Some of the links in this entry may be old (which is to say their webpage may have been moved or taken-down).  I have tried to update it.  People will post at the bottom. ]
GNOME Project
GNOME is the default desktop in Ubuntu (through release 10.10), and a list of GNOME projects is available.
Ubuntu Screenshots and Screencasts
What is Ubuntu?
Ubuntu 10.04 3D Desktop and other YouTube videos.
New Applications Resources
GetDeb – Features the latest versions of software available from the official repositories as well as software not available in the official repositories. Available in easy-to-install .deb files (see Apt and Package Basics).
Top 100 Open source Applications
Linux Alternatives
There are also longer lists of add-on applications available on guide sites like these.
Other *buntu guides and help manuals
Kubuntuguide
Lubuntu — Lubuntu can run with as little as 256 MB of RAM.  It is better for older machines with limited resources.
official Ubuntu Server Guide — a good starting reference for server packages
Ubuntu Doctors Guild — a collection of tips for using (K)ubuntu Linux in health care environments
SkoleLinux — a collection of (open-source) educational tools for Debian/Ubuntu Linux

61. Entry 61:  SUPPORT:  Canonical, Ltd., the company that develops Ubuntu, has pay-for support plans available, for home-users and for business-users, if you're interested.  Just Google it.
62. Entry 62:  COPYING AND PASTING INTO AND OUT-OF A TERMINAL WINDOW:
Copying and pasting into and out-of a terminal window does not work using the traditional keyboard commands.  Operating-system commands or other data should never be pasted into the Terminal from an ordinary document or word-processing shell anyway.  Us newbies should always use a GUI text-editor for this.  Just look for a desk-top note-taker app that knowledgeable Linux ppl tell you is also acceptable for the purpose of editing text to be pasted into a command-line.  Often you will then be able to find it in your menus, because most builds come with one.  You do not want something that puts hidden formatting that you don't see with your eye into the Linux Terminal, because this has a way of "confusing" the Terminal-programming—and that is what normal documents-creator programs like Word, Works, Abiword, or OpenOffice Writer will put-in.  Two common Linux apps that are acceptable to use as GUI text-editors for pasting into the terminal are 1) gedit, and 2) Geany.  There are more available, out there in the Linux metaverse, for free download.
In order to paste a line of text or code into the Linux terminal, highlight it, ctrl + c to place it on the clipboard; then click the mouse in the terminal, watch where the cursor-bar is flashing—that’s where the paste will start-from; then ctrl + SHIFT + v.  This usually works.  There are other methods detailed online, or in this data-base.
To copy from the terminal, drag the mouse in the Terminal to highlight what you want; then ctrl + SHIFT + c.  Or I think you can try clicking the mouse-wheel.  The wheel of your mouse—if you push-down on it hard, and then immediately release it, will produce a special, “middle-click”, unless it is a very old or very cheap mouse.
63. Entry 63:  NO DEFRAGGING NEEDED.  No disk-defragmenting is done in UNIX/GNU/Linux.  Because UNIX is structurally different, no such service or operation is necessary to maintain the integrity of files, or make it easier for the system to find a file.
64. Entry 64:  ABIWORD IS A FINE PROGRAM, BUT…    AbiWord can be very buggy.  Restarting it a few times often works.  If it did not ship with the distro, but you installed it later, it can be difficult to get it to work at all.  This is usually a matter of the novice operator not using correct install procedures.  The proper methods of installing it may be touchy in some distros.  It is a very nice program, especially for those of us who like to write.  If you are going to use this one, make sure it is installed in the proper way, for your particular Linux distro.  If it came with the distro, already installed, restarting the program and then trying to use it a few times may be effective.  I do not believe AbiWord is RAM-intensive.  Operator error in installation of this one is a big reason some ppl have difficulty with it—they were cavalier as to installing it.
65. Entry 65:  A CLIPBOARD WORK-AROUND:       Sometimes the "paste-special", or "paste as unformatted text" feature in Open Office Writer/Libre Office Writer remains greyed-out, even though you have an amount of information copied to the computer's clipboard.  A workaround is to simply ctrl + v, pasting the text as formatted; then, highlight it with the mouse, ctrl + x, and then try again to paste it with the ribbon-icon.  Deleting the side-bar comments from .doc files into which I have pasted formatted stuff in OO Writer also speeds things up a great deal, and I've never had occasion to use the "comments" for anything anyway.  Just horizontally scroll to any one of the comments, then click the little drop-down arrow it has with it, and click "delete all comments".
66. Entry 66:  FILENAMES:    As to naming files, for compatibility on Linux and dos-based systems:
NO FILE that you create that is going to be used in Linux should begin with the name of an Application.  This is probably over-kill.  It’s your system.  You decide.  A file pertaining to how a user should operate Firefox (I mean some directions/instructions) should probably not be called “Firefox ~ blah blah”, at least to my way of thinking.  You might try something like “f-firefox ~”.  I prefer to use all lower-case for file-names, with a space between words, and without underscores or hyphens.  Because this will make it more acceptable to WINDOWS, too.  And you will perhaps need to send information containing the name of one of your files to a friend who uses WINDOWS, or even to one of your own computers, at some time in the future.
Spaces in filenames are actually legal on both Windows and Linux file-systems; the catch is that when you work at a command-line, a name with spaces has to be put in quotes or "escaped", or the shell will read it as several separate names.  The graphical, point-and-click file-managers handle all of that for you.  So I guess it's up to judgment.  You might want to use something like "my-sales-report-file" instead of "my sales report file".  But my own experience is that either will work.  * Until * you need to work with something in a Terminal, or otherwise outside of your usual graphical environment.  Which I have found I almost never need to do, as most modern desktop Linux distros just don't require it.  If this latter becomes necessary, just rename "my sales report file 1" to "my-sales-report-file-1", using Nautilus or whatever file-manager, and then go back to the Terminal.  A lot of Linux users recommend use of the underscore instead of the hyphen ("my_sales_report_file"), to separate words in a filename.  Either is fine:  in my experience, modern Windows opens files with underscores or hyphens in their names without complaint, so use whichever you find easier to read.
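To make the point concrete, here is what a filename with spaces looks like at the command-line (the filenames and the destination folder are just made-up examples):
cp "my sales report file" /home/luke/Backups/
cp my\ sales\ report\ file /home/luke/Backups/
Both lines do the same thing:  the quotes (or the backslash in front of each space) tell the shell that the spaces are part of one name, not breaks between several names.  Leave them out, and the command will complain that it cannot find a file called "my".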
So I continue to use lower-case filenames, with a space between each word.  On the rare occasion I need to do something with one in a Terminal (because I’m a bit of an experimenter, remember?), then I just use my file manager to right-click it and re-name it, to something like “lukes-music-file-1”, or “wandas-news-clipping-1”.  WINDOWS will open these, it seems.
Truncating/abbreviating long words is often something I do, but is probably not necessary.  It is generally recommended that we try not to make filenames too long.  Some of my own files have names that are * very * long indeed, though—and I have yet to have an issue.
I have noticed that DIRECTORY (Folder) names may be somewhat of a different story.
I find that in Ubuntu 10.04 (.2), the best policy seems to be:
When creating a new folder, make the first letter of the name a Capital letter, and use lower-case for the rest of it.
If you just bloody well can’t stop yourself from havin’ to have a space between words in a long Directory name, type an underscore (_) instead of hittin’ that space-bar.  Or use a hyphen.  Microsoft Windows seems willing to open a Folder with either of these special characters already included in the folder name—like for example a folder you imported from Ubuntu or Linux Mint, et al, via DropBox, or with other software.
Of course, avoid "weirdo" characters, like &, %, @, and so-on.
So the name of a directory I create in Ubuntu 10.04 looks like this:
Myvacationpictures
or :
Lukes_new_soup_and_stew_recipes
NOTE that I did not try to use an apostrophe (‘) in my name, to indicate possession as we would in normal English.
NOTE that you rename a directory in Ubuntu the same way you would in WINDOWS:
Go to your Home directory, which will of course display all the folders of your personal user-space (no systems-files:  you gotta navigate to them a bit differently).  Now RIGHT-click on the folder, and a context menu should pop-up.  Click "Rename".  Click the mouse-cursor in the name somewhere.  Now you can click wherever you need to in the name, and use the Backspace key to erase characters, and re-type the characters you want.  When you're done, you hit Enter, and it should save the change.  BUT REMEMBER THAT YOU SHOULD BEGIN THE NAME OF A FOLDER WITH A CAPITAL LETTER, FOR LINUX-CONVENIENCE.
So, “Lukes-music-stash”, NOT “lukes-music-stash”.
If you ever have to work with folder (“directory”) stuff the non-graphical, non-GUI, non-point-and-click way (which I doubt), note that while Windows uses backslash (“ \ “) to separate folders, Linux is the opposite, always using “ / “.
And the Linux master (Root) folder is represented simply by the forward-slash itself ("/")—the leading slash at the start of any absolute path.  Linux Terminal windows often make use of the "#" symbol * at * the * command * prompt * to signify an instance of the Terminal having been invoked with Root (direct Administrator) powers:  but this is a situation a normal desktop computer user is rather unlikely to encounter in Ubuntu and most of its variants, as Ubuntu is built to run wholly from a * User * account, and to enable any Administrator-level operations (rare, for a normal user) from the "sudo" program in Terminal.
It is assumed by the Linux user that "Everything grows out of the Root directory, ultimately".  Which it does.  So in a file path that is:  "/home/flyingfisherman/Downloads/crunchbang-instructions-1", the true Root folder is that first, leading "/".  You will, sometimes, though, see commands posted on various forums stated as something like "# update-grub".  Like perhaps on Fedora Forums.  This is because Fedora will let you log-on as true Root (and I think MEPIS-Linux will also).  In order to understand the same command in terms of Ubuntu + its variants, make it "$ sudo update-grub".
In most other contexts (other than the actual command-prompt), the # symbol means "comment", as in the (parenthetical) "tips" writers of actual computer-code include in the source-code, to warn each other of possible issues, or to explain what the previous line of code was supposed to do.  And in certain other contexts, # can be used for other things:  but just knowing the meaning of the # symbol in relation to the command-prompt in forum-posts will be all the average, normal user will probably ever need to understand about this symbol.
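As a tiny, hedged illustration of that "comment" meaning, you could type something like this into a Terminal (the command itself is only an example):
    $ echo "hello there"    # everything from the # to the end of the line is a comment; the shell just ignores it
    hello there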
67. Entry 67:  SAFE MODE AND SYSTEM-RESTORE ARE NOT NEEDED IN LINUX:  There is no * ready-made * Safe Mode, or System-Restore feature in Linux.  [If you want something like Windows' restore-points, though, I guess you could check-out BackInTime-for-Linux.  There is also Flyback, and others].  Linux has no OEM product-key either.  No defragmenting.  These are not needed.  You do not have to pay for Linux.  The original live-cd from which you installed your current version serves as your backup disc.  You backup data added to the system with removable media, and with cloud-services.  You can also backup settings and customizations to these, and restore them in the event of a re-format—though the necessity of re-formatting is rather rare in the case of a home-user of desktop Linux.  There are also numerous Linux alternatives to programs like Microsoft SyncToy, such as FreeFileSync, Grsync, Meld for Linux, and Deja Dup, which can copy and/or sync whole directories (folders), and (depending on which one you use), even all the sub-folders under those.
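If you want a Terminal-flavored taste of that kind of folder-syncing, the venerable "rsync" program (which tools like Grsync just wrap in a point-and-click face) can copy a folder and all its sub-folders to removable media.  A minimal sketch, assuming your thumb-drive happens to show up at /media/MYUSB (yours will have a different name):
    $ rsync -av ~/Documents/ /media/MYUSB/Documents-backup/    # -a keeps dates and permissions, -v shows what it is doing
Run it again later, and it only copies what has changed since last time.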
Anti-virus/anti-malware programs are useful to run in Linux, mostly to scan a file you intend to pass-on to someone with a WINDOWS computer.  (You wouldn't like to send your best friend some e-mail attachment that contained a WINDOWS virus which you didn't know was in there.  So scanning it first with ClamAV or AVG-free for Linux-systems makes some sense, doesn't it?)  The free versions of these are perfectly adequate—you don't have to pay for anything.  Just RIGHT-click on the file or e-mail attachment, and the context menu that opens should give you the option to scan with whatever anti-virus you have installed.
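With ClamAV, for instance, you can also run the scan from a Terminal if the Right-click option isn't there.  A hedged little sketch (the file name is only an example):
    $ sudo apt-get install clamav                      # one-time install, on Ubuntu-type distros
    $ clamscan ~/Downloads/attachment-from-bob.doc     # scans just that one file and reports anything it finds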
Free programs like BackInTime, Flyback or MondoRescue GPL/free Edition are about as close as I can find to WINDOWS System Restore, and utilities such as these can stand-in for WINDOWS Backup Center, too.  They work very similarly to Apple/MAC's Time Machine (especially BackInTime).  Also Deja Dup (pronounced "Day-zhah-Doop") is a good one to check-out.
There do not seem to be any effective Linux viruses “in the wild” (read that “outside of the confines of a computer-science laboratory somewhere”).  At least at the time of this writing.  Nor do there seem to have been any for many, many years.  There are root-kits, but these seem mainly targeted at servers.  There will be entries as to easily making Linux even more secure, later in this document.
If Linux does have some major problem (rare, if you have followed the advice above, in the foregoing entries), Linux can be re-installed, or another distro can be installed, or it can simply be wiped from the hdd.  (Research how this is done—especially if you intend to do the latter).  If you were able to set-up your Linux with a separate Root and Home partition, a re-install will be that much easier.  But this may be deeper water than a noob would want to wade-into, in his or her first dip in the pool.
If you really want to make a full backup of an install of Linux, you can create a ghost-image with RemasterSys, P.I.N.G., Clonezilla, or others, which you can download for free; or, you can use any one of several utilities/discs that you can d/l free, or that you can buy.  Acronis True Image is one that you can pay-for, and which many Linux server-administrators seem to like—at least at the time of this writing.  I don’t much care for Norton Ghost anymore—unless maybe you can find a version that was built before somewhere around 2002.  When the Norton folks merged the original Ghost utility with that other program they acquired in a merger (its name escapes me at the moment), that seems to be about the time Ghost started to lose its mojo.  Too bad.  I’ve never felt the need to clone one of my drives—probably because I’m just a desktop-user, and I don’t have that many settings and softwares I’d have to manually restore, anyway.  But if I did, I think I’d probably try RemasterSys first, based-on what research I did do.
68. Entry 68:  LINUX COUNTS FROM ZERO, NOT FROM ONE:    Try to remember that Linux counts things (like interfaces) from 0, if it can.  That's why Eth0 is Eth0—and not "Eth1".  Eth0 is "Eth" (ethernet) + 0 (a zero).  The zero (0) is thinner than the capital letter "O".  That's the principal way to tell the difference between "0" and "O".  A strikethrough is also sometimes used to denote a zero.  Some computers are hooked-up to more than one Ethernet connection, and can switch back-and-forth; in these cases, Linux will display both Eth0, * and * Eth1.  (And perhaps also Eth2, Eth3.)  WiFi is handled differently, but the naming follows the same zero-first pattern (wlan0, wlan1, and so-on).  Look elsewhere in this data-base for information pertaining to wireless connectivity.
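You can see this zero-based counting for yourself by asking your Linux to list the network interfaces it knows about.  Either of these should do it (just a sketch; the exact output varies from machine to machine):
    $ ifconfig -a     # the older, classic command: lists eth0, wlan0, lo, and so-on
    $ ip addr         # the newer equivalent, present on most modern distros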
This way of counting is evident in the GRUB utility for bootloading, also.  You might wanna try and remember this, in case you ever find yourself editing certain Grub configuration files—though you probly won't need to get this deep into stuff, just to use desktop Linux.  But if you did, you'd notice that the partitions on your harddisk might be understood by Grub as a digit lower than the same ones as shown in G-Parted.
69. Entry 69:  LINUX IS CASE-SENSITIVE.  (Except when it isn't case-sensitive.)  Unlike Windoze, Linux cares how you enter a search.  In Ubuntu (at least), the graphical ctrl + F "find" command usually looks for stuff in BOTH capitals and lower-case spelling, and will locate it for you regardless of how it was put in the machine—just like in Windoze.  But not so in a Terminal.  In the Terminal—at least—you gotta enter the info in the same case as the file you wanna work-with, or it won't know what you're telling it about.  So to navigate to the Downloads directory (in Terminal), you gotta type "cd Downloads", and not "cd downloads", or you'll get an error message.  When searching for something in a commandline or Terminal, there are "options" ("arguments", "commandline switches") listed on the web, which can easily cause the system to (temporarily) ignore lower-case/UPPER-case distinctions.  I think a major one is -iname, but you'd better check some authoritative instructions, for your distro.  Most UNIX commands work, generically, across distros—or so is my limited experience.
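Here is a hedged little sketch of that case business in a Terminal (the file names are only examples of mine):
    $ cd ~/Downloads                 # works: the folder really is spelled with a capital D
    $ cd ~/downloads                 # fails with "No such file or directory"
    $ find ~ -name  "Recipe*"        # only matches names that start with a capital R
    $ find ~ -iname "recipe*"        # -iname ignores the case, so it matches both spellings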
70. Entry 70:  INSTALLING LINUX:  I don’t recommend installing one of my “big three” to your harddrive, unless it’s Ubuntu or its variant Linux Mint, or maybe one of the other Ubuntu variants.  Maybe not even then—at least not initially:  Ubuntu and its variants can be run from a thumb-key/flash-drive.  I am inclined towards flash-drive installs for us noobs (or, perhaps better yet, an external harddrive).  KNOPPIX and the Puppies-Linux were built to run as a “live-environment”, and this from Day-1:  so these are/were not really intended to be installed to the harddrive.  Another worthy option is an external, add-on harddrive, or a micro-drive (mini-external-harddrive).  Remember that you must back-up all data before you do any “radical” procedure to your system—like work on the harddrive.  Because nothing is perfect.  (Skip-down a couple paragraphs, if you want.)  Many builds of KNOPPIX to-day have an “installer”—of sorts—for the purpose of harddrive installs.  But really, KNOPPIX is not intended for a traditional harddrive install, even to-day.  KNOPPIX is organized to run as a “live file-system”, and so does well when run from a cd or USB-thumb key, with or without persistence configured.  It is also true that a lot of people run Linux as a “virtual machine”, using virtualization software, such as VMWare.
Ubuntu and its variants (such as Linux Mint) use an installer-program, coded especially for installing them to your harddrive:  the name of this "Ubuntu installer" is Ubiquity.  This "wizard" asks specific questions, and where you aren't sure, it is usually fine to just go with the default answer.  The installer will add a partition to the harddrive for the new Linux, and point the install at that partition by default.  Note that this is aimed-at installing the Linux * alongside * your WINDOWS operating system, by shrinking the harddrive space on which WINDOWS sits by a certain amount:  but modern harddrives almost always have more space for WINDOWS than it needs.  You can find-out on the web how much space is recommended for your version of WINDOWS:  however, the Ubiquity program is very capable of doing the "side-by-side" type install "automagically", and safely, re-sizing WINDOWS and making your Ubuntu partition, all the while not shrinking WINDOWS to too small a size.  This is important, because if you shrink the partition of any operating system to too small a size, it won't be able to boot, and will have to be put through a recovery process or else deleted.  Ubiquity will also "automagically" create a (small) "Linux-swap" partition during a side-by-side install, which is necessary as the Linux equivalent of WINDOWS' "virtual memory".
They call this side-by-side type of install “dual-booting”, though you can only boot into and run one system at a time.  The Linux program * GRUB * will be installed to your computer, but will sit on the Ubuntu partition.  This will be the “chooser” program, that will  allow you to select either WINDOWS or Linux, just shortly after you punch that power-on switch to turn-on the computer.  One makes this choice by using the down (and also up) arrow keys on the computer’s keyboard.  The technical name for such a “chooser” program is “bootloader”.
This can be hard for a noob to undo—though many ppl have just gone back to WINDOWS if they decide Linux isn't for them after all, and left the dual-boot arrangement in place, which won't hurt a thing.  But the Linux partition will continue to occupy a chunk of space on the drive, though not as much as WINDOWS takes—unless you chose to give it more when the installer showed you how much it was going to take:  it's usually good to just go with the default size, because the installer in these two distros is pretty smart; but it gives you the option to choose different sizes if you want to.  Harddrives to-day are a lot cheaper than they were, and much more spacious.  Remember not to dump your WINDOWS install:  even if you take to Linux like a duck to water, you may still need it, once-in-a-blue-moon.
Since * at least * version 9.10, Ubuntu’s Installer program (which you can activate from the live-cd) seems to have been much improved.  If you happen to  get to reading an old thread on the internet, and people are talkin’ about how they had to partition their harddrives themselves, and what a pain it was, and how they then had to copy the system files to the harddisk, &tc &tc, be sure and check the * dates * of these posts, as to * when * the authors posted them.  AND BE AWARE NOW, that Ubuntu’s Installer nowadays does all this * for * you, if you choose to install it to your system’s internal harddrive.  (Unless perhaps you choose “custom install” from the menu for some reason).  So you probly won’t have to run G-Parted yourself, or any of that crap.
I have installed Ubuntu 9.10 to the harddrives of three machines, and the first two went-off without a hitch, and each of these took approximately 90 minutes of my time.  (I currently use Ubuntu 10.04 LTS, BTW.)  The third install of 9.10 (to a neighbor’s custom-built machine) wasn’t so smooth; the option to install Ubuntu side-by-side with Windows (dual-boot arrangement) did not appear.  Nor did the machine seem to fully support Karmic Koala once it had installed, even though we had checked the basic hardware (CPU and RAM ) for capability.  I think I could have fixed it, with a little help from the owner of the metro-area computer shop who built it (back in c. 2006 is about when it was put-together).  He (the computer shop proprietor) talked like he would be glad to answer some questions about the machine with his label on it, to help me figure it out.  But Jim (my neighbor) lost patience at this point, and just had the guy re-format it with WINDOWS 7.  Linux is not for everybody.
Be prepared, though, if you are intent upon installing any Linux distro to a machine's internal harddrive:  read the manuals, and the rest of what I (and others) have to say here, and make sure your battery is full if you're gonna try this on a laptop.  At least read-through the distro's release notes once, even if it seems like you understand very little.  And be sensible, as to when you plan to undertake it, if you are gonna mess with the harddrive-method.  Don't plan it for that morning when you know it's gonna be your turn to pick-up the kids from soccer practice.  Because it might take you longer than 90 minutes, dude.  And for God's sakes have backed-up all your data first—like to removable media such as cds—and defrag WINDOWS:  then probably open Windows Updater and download and apply any updates that it can find.  And then create a restore-point.  These are easy things to do—especially in WINDOWS 7 or WINDOWS 7 Starter-build.
[The equivalents of these procedures, * where applicable *, are just as easy in most modern Linux, BTW.  You don’t have to defrag in Linux, for instance, because Linux has a better file-allocation and write-to-disk modus. ]
[ I don't wanna sound like your naggin' mum, but, have you gone to Windows' Backup and Restore—like within a day (or maybe a few minutes) of getting your Windows computer set-up—like * BEFORE * you went online with it, and created all your backup disks—like your backup boot-disc set that Windows * used * to come-with from the vendor, and nowadays often doesn't anymore?  Or your Windows 7 repair disc(s) (which is a little different again, but also recommended by Microsoft); or your Administrator password reset disc?  And/or have you used Backup Center since then—like every Sunday night before bed—like before shutting-down the computer for the night?  You do shutdown your WINDOWS computer for the night, at least every once in a while, don't you—like so that recently published updates can be applied?  And what about those WINDOWS 7 or Vista updates that are waiting to be added?  If you do not know what these things are, or at least have a rough idea of how to perform them, then you should Google them, and, once you have done them, return here, and pick-up where we left-off.]
What these last several paragraphs (pertaining to internal harddrive installs of Linux) add-up-to, is that I think I’d start-out in messin’ with Linux by usin’ USB thumb-key(s), if I wuz you.  This method of getting Linux is becoming increasingly popular anyway, and for several reasons I will touch-on in here.  It is good advice for us noobs—and maybe even for more experienced WINDOWS people.
Even if your machine meets the minimum RAM and CPU requirements for a distro, don’t expect every distro’s live cd to run properly on your system, or its install disk to work properly with it.  These things usually work; but once in a while they don’t, or they seem “shaky”.  If the installer is having problems installing a distro to hdd, back-out.
Really, it might be best to try a distro for which the *recommended * RAM and CPU requirements are at least a few “degrees” below the actual hardware resources of your machine.  It is also true that I have seen various desktop Linux run * just * fine * on equipment that is far below the recommended hardware requirements for that particular distro.  The conclusion I draw is that these * minimum * and * recommended * hardware specs are a * GENERAL * GUIDELINE * to performance.  That is one reason so many knowledgeable people on the various Linux forums use the disclaimer “Your mileage may vary”.  Because it does.
The situation is not all that different with WINDOWS XP, by the way.  Microsoft claims (or has said in the past) that XP will run on a machine with only 128 MB of RAM installed.  But it's more like "crawl", than * "run" *.
Yes, the situation is arguably even murkier in Linux.  But this probly has more to do with the major hardware vendors than with Linux itself.  The big manufacturers in the pc markets have little incentive to co-operate with the Linux community at-large, when it comes to releasing some arcane details of their hardware architectures.  And much more incentive to have a working relationship with guess who.  And yet, some of them actually have gone out-of-their-way to "grease the tracks" for Linux—though not when it comes to every model and sub-series they produce.  Would that it were otherwise.  If you are willing to just try some different distros as live-cds, however, and do some amount of research, you may just find a Linux that will suit you.  This is just another bump in the road—one of several about which I said I'd be frank.  Fore-warned is fore-armed.  If it's too much, there's always a return to WINDOWS.  And those who depart from Linux oft times return, perhaps in a few months or years, when compatibilities may actually be * better *.
A Linux shorthand to describe hardware requirements for running a particular distro is often rendered "i386 (and i486, i686, &tc.)".  "i", in this context, refers to * intel *—specifically, their central processors.  "386" refers to their 386-era family of these processors, released into the product-stream a very long time ago (c. 1985).  All of them [i386, i486, and i686 (and sometimes you see "i586")] are fitted under the general rubric of "IA-32"—more commonly rendered "x86".  A distro that says it is "i386 compliant" will most probably run (or perhaps just "walk") on a very old computer, as well as probably a new one too.
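To see where your own box stands against these requirements, you can just ask it, from any Terminal (even one in a live-cd session).  A minimal sketch:
    $ uname -m                          # prints i686, x86_64, &tc.: the machine's architecture
    $ grep "model name" /proc/cpuinfo   # the human-readable name of the CPU
    $ free -m                           # how much RAM is installed (the "total" column), in megabytes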
I will name some “lighter” distros for you, in the rough order of my own preference.  These are “lighter-weight” versions of the more common distros, that will often run better on a box that’s having some trouble.
In such circumstances, you could try:
Linux Mint Fluxbox (this is the Fluxbox DE edition, not to be confused with mainline Linux Mint); Mepis AntiX (any reasonably current build—this is a * pretty good * light distro); virtually any of the Puppies-Linux, like Wary Puppy 5x, or MacPup; Xubuntu (this is Ubuntu with XFCE instead of GNOME or Unity); Crunch Bang Statler; Crunch Bang 9.04 (archaic, but still a fave); Pinguy-mini; Ping-eee (Pinguy for netbooks:  this shows a lot of promise; but remember, at the time of this writing, at least, the netbook version is still pretty new).  Or maybe PCLinuxOS LXDE edition.
These are lighter-weight, less resource-demanding, than their main-line counterparts.
But everybody has their preferences, and you can be sure that others will post here.
It is also true that there are other hardware compatibility issues:  unfortunately, as I have indicated in much earlier paragraphs of this very document, I have to admit that desktop Linux just has a good deal of difficulty running on not a few machines.
Where it comes to CPU compatibility, I guess a simple rough-guide would be:
i386 = start of 32-bit Intel processors, c. 1985
i486 = improvement to i386
i586 = Pentium 1
i686 = Pentium Pro, Pentium 2, Pentium 3
i786 = Pentium 4

THE BOTTOM LINE I guess, is that (theoretically at least), a Linux distro that is tagged as “i386”, or “i386-compliant”, can be run on a “very old” computer, speaking very generally.  Theory is not always practice, but Linux deserves a lot of credit for putting-forth more effort (a lot more) toward maintaining its compatibility with existing machines, as opposed to expecting you to have to buy a new computer to run the next version of Linux.  This is perhaps the principal reason Linux is regarded as a means to “recycle” older, “useless” computers.
What I do recommend that you do, after having decided on a distro to try to  use on a daily-basis, after having tried them as live-cds, is to install it to a thumb-key (a flash-drive).  Flash-drive installation and running is becoming an increasingly perfected science, even as I write this.  And it is becoming more and more popular.  I have presented some information in this blog, as to how to install Linux to and run from a USB thumb key.  I have to say that I find * Universal USB Installer 1.8.6.3. * from Pendrivelinux.com an excellent starting-point in this regard.  It is easily run from WINDOWS 7 or Vista, and will probably also work from XP.  It is easy to create good thumb-drive installs with this.
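If you already have one Ubuntu machine handy (or a live-session running), Ubuntu also ships a tool of its own for this, usually called "Startup Disk Creator".  A hedged sketch of calling it up from a Terminal (on most Ubuntu desktops it is already installed, so the first line may not even be needed):
    $ sudo apt-get install usb-creator-gtk   # fetch it, on Ubuntu-type distros, if it is missing
    $ usb-creator-gtk                        # opens the graphical Startup Disk Creator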
Another worthy option is an external, add-on harddrive, or a micro-drive (mini-external-harddrive).  If you have the money to buy an external harddrive, this can be a handy way to get around the headache some computers have, when it comes to dual-booting Linux from the same drive which contains MS WINDOWS.  (I will insert here, however, that at least one acquaintance of mine has reported that her machine actually * has * the headache * when * booting * from * an * external * harddrive *.  So I guess a good rule-of-thumb in desktop Linux would be:  “to every rule-of-thumb in desktop Linux, there’s an exception”.)
My USB harddrive only cost about 100 USD, at the time of this writing.  But that can be a lot if you haven't got it.  It is probly also true that an install to an external harddrive is easier to undo or re-format, than the traditional dual-boot setup—provided you are able enough with graphical disk-tools that you don't accidentally modify the wrong drive or partition.  WINDOWS identifies drives and partitions with letters of the alphabet, usually starting from "C:\".  (It used to be from "A:\", but probly very few people today have a machine that would start this way.)  Linux disk-tools—like even many of the installer-programs—name drives and partitions with labels more like "hda, hdb", or "sda, sdb1, sdb2", &tc.  It is this business of making sure the installer or other "disk-tool" is "pointed at the right drive", that is arguably the touchiest part of installing Linux.
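Before letting any installer or partitioner loose, it is worth double-checking which drive is which.  A hedged sketch from a Terminal (read the sizes and names carefully; they will be different on your machine):
    $ sudo fdisk -l    # lists every drive (/dev/sda, /dev/sdb, ...) and the partitions on each
    $ mount            # shows which partitions are mounted right now, and where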
Note that these types of add-on drives were not invented with the intention that they would be on and running all the time, but were rather intended more for things like storage and backup.  However, technology advancing as it does, many people on long flights now seem to have them up-and-running for many hours without issue.  Remember that an external harddrive has no built-in cooling, so an external way of helping it dissipate heat may be advisable.  (Like maybe setting it on a laptop cooler or netbook cooler).  As there is no protective chassis and computer case, anything that protects against unexpected vibration might be a good idea too.  Perhaps just operating it while it lies on one of those gel-filled vinyl masks people sometimes wear across the region of the eyes to soothe and cool the blood vessels might be an idea.  Such a thing will absorb and dissipate a fair amount of heat, and is good at dampening vibrations, such as those that can sometimes occur in takeoff and landing.  Drugstores often have them to sell.
The BIOS of some machines (especially older ones) does not offer the option to boot from a USB port.  This can be overcome by learning to make a special type of "boot-helper cd".  You could look at the file "l boot from usb when bios is not able" (this'll tell how to make a helper-cd using the PloP boot-loader), or Google the issue if it crops-up for you.  Sometimes this lack of USB-connection bootability can be remedied by just updating the BIOS firmware (the miniature built-in program that the BIOS runs).  This is sometimes referred-to as "re-flashing the BIOS".  Better let a professional tackle this one, unless you're already WELL on your way to being a hardware tech professional.  If a BIOS re-flash procedure doesn't come off right, the machine will probly not be able to boot anything again, until it is straightened-out.
As I have already said in the foregoing text, some computers have difficulty with the usual “dual-boot” arrangement.  Exactly * why * this is occasionally the case is beyond the scope of this document.
Besides maybe  trying to run Linux from a USB-key or external harddrive, there can be other ways to try to deal with the “dual-boot headache”, if your machine has it.
One means might be to check-out “Grub4dos”.  There is a fair amount of information about this online.
On many machines, it is also possible to use the WINDOWS bootloader to boot your Linux, as well as WINDOWS.  WINDOWS XP uses a program called "NTLDR".  You've probably never seen it displayed during a boot of a WINDOWS machine, as WINDOWS machines are just in the business of booting WINDOWS, by default.  When desktop Linux gets installed to the computer, side-by-side with WINDOWS (the traditional dual-boot arrangement), then Grub gets installed and sits in the same partition as Ubuntu, or whichever Linux.  But a very small part of Grub (I guess that would be what they call "stage 1") gets installed to the first sector of the first harddisk, to the MBR, so that it points everything at the main part of Grub at boot time.  There are multiple ways to arrange this, but the traditional dual-boot arrangement (for lack of a better term), seems to be the way most people run desktop Linux—where they install it to their harddisk.
So then Grub takes-over from the machine’s BIOS when booting-up, instead of Windows’ NTLDR.  Or at least I’m guessin’ that’s how it goes.  And then Grub will show you its entries in its menu for selecting WINDOWS, and also those for selecting Linux, and you pick what you want before it times-out (usually about 10 seconds).
But there are ways to get Grub hooked-into NTLDR, such that you can be presented with a choice of whether to load WINDOWS at boot-time, or else hand-off the process to Grub, which of course can boot Linux instead.
Sometimes this works, to cure a machine with the “dual-boot headache”.
You might wish to check-out the following link, if you want to consider this route.
http://www.icpug.org.uk/national/linnwin/step00-linnwin.htm
Where it comes to Vista and 7, all their boot stuff is in a folder.  It is called the BCD (Boot Configuration Data) store, and it sits in a folder named "Boot"; the new boot-manager itself is a little program called "bootmgr".  So the situation has evolved, once again.
It might also be useful—since we're gettin' in deeper than we should here, anyway—to see that Grub seems to understand and identify drives and partitions a little differently than, say, G-Parted or Windows Disk-Manager.
GNU-Grub seems to label stuff as “(hd0,0)”, and “(hd1,0)”.  This is if you really get into Grub guts—you won’t see this kind of jazz in your usual, ordinary Grub boot-menu.
Remember that UNIX/Linux usually counts from zero (0), not from one.  So "hd0" is just the first harddisk Grub can see.  The comma and then 0 or 1 is just its way of denoting a partition on that harddisk.  So "(hd1,2)" denotes the * third * partition on the * second * harddisk (like if you had one of those expensive laptops that come with * two * harddrives inside, instead of just having one).
So what we see by this example is that the digits are "slipped" by one.  Hd0 is really harddrive 1.  Partition 2 is really the third partition on that disk.  (A heads-up, though:  the newer Grub 2 counts * partitions * starting from 1, even though it still counts * drives * from 0; so in Grub 2, "(hd1,3)" would be the third partition on the second disk.)  Sorry it's a little confusing, but you'll get the hang of it.
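So, as a rough cheat-sheet (this is the Grub-legacy style of counting; the names in parentheses are what a tool like G-Parted would show for the same thing):
    # (hd0,0)  ->  first harddisk, first partition    (what G-Parted calls /dev/sda1)
    # (hd1,2)  ->  second harddisk, third partition   (what G-Parted calls /dev/sdb3)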
AN UPGRADE [TO THE NEXT INCARNATION (RELEASE) OF YOUR LINUX DISTRO] IS ESSENTIALLY THE SAME AS A RE-FORMAT.
Or it is a little more similar to a re-format than, say, upgrading XP to WINDOWS 7.  Both these operations can be hairy, actually.  And therefore should not be undertaken lightly.  REGARDLESS OF WHICH OS PLATFORM YOU ARE GOING TO UPGRADE—EITHER WINDOWS OR DESKTOP LINUX, YOU HAVE NOBODY BUT YOURSELF TO BLAME IF YOU DO NOT THOROUGHLY BACKUP YOUR FILES FIRST, AND EXPERIENCE DATA-LOSS LATER.  Backup all your stuff with tools like WINDOWS Backup and Restore Center, or BackInTime-for-Linux, Deja Dup, Unison, or rsync; or just manually copy your (Linux) Home Directory before you begin.  You may have to do one directory at a time, if your Home folder is a lot larger than your RAM.  This is as true in WINDOWS as it is in Linux.  That's why people hate to upgrade.  It's a PITA (Pain In The %^$@!%).
I also highly recommend that we backup all those Linux kernels that appear in the Grub menu at boot time.  Some will call this unnecessary, but I call it VERY necessary.  Yes, needed versions of the Linux kernel * can * be downloaded from the web; but why do that, when you can just make a backup-to-cd?  Nor do we ever really know if we will need one of these older kernels in the future.  No, using one is not optimal.  But it can be a handy stop-gap solution, in case an update to the system should break some crucial app [this would be rare—but it has happened].  And what of that custom-kernel you may have compiled?  Best to back it all up.  It takes little space, and backup media today is not expensive [at least for us normal desktop users].
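A minimal Terminal sketch of that kind of pre-upgrade backup, assuming the backup drive happens to show up at /media/MYUSB (yours will be named differently):
    $ mkdir /media/MYUSB/pre-upgrade-backup
    $ cp -a ~ /media/MYUSB/pre-upgrade-backup/         # copies your whole Home folder, settings and all
    $ sudo cp -a /boot /media/MYUSB/pre-upgrade-backup/    # the kernels that show up in the Grub menu live under /boot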

I DON’T RECOMMEND YOU USE THE UPGRADE BUTTON IN UBUNTU’S UPDATES MANAGER.  THIS WAS A NICE IDEA ON UBUNTU’S PART, BUT SEEMS NOT TO BE FULLY PERFECTED.

I prefer to upgrade Ubuntu by means of the Alternate Install cd for the new release.  Why this way?  I'm told we should not attempt to upgrade a desktop system with another desktop cd—and so the Alternate Install cd, whose .iso you find in the same download webpages where you find the regular cd .iso files.  Furthermore, Ubuntu's 'Upgrade Button' (found at the top of Updates Manager) has its appeal—the appeal of convenience.  But it does not have such a groovy reputation among those who've used it.
By doing an upgrade with the Alternate Install cd, you get to check the hash value of the iso * BEFORE * using its code in installing/upgrading.  Ubuntu will complete the upgrade by means of the internet, once the Alternate cd's installer program has been run.  So it will download a bunch of added code, and install that as it receives it over the internet.  But you got to hash-check the bulk of things, which were fetched from the Alt cd.  Whereas if you just used the Upgrade Button, then you'd not get the opportunity to hash-verify any of the code that the upgrader is installing to your computer—it would all come over the network.  Nowadays this is generally reliable; but under various circumstances the data can be corrupted in transmission, and you might have to start all over—perhaps with a crippled system.  Why not just go to the small trouble of downloading, hash-checking, and burning the Alternate Install cd at upgrade time, and start-out with at least 2/3 of the code verified by the hash?  If it's a bad download, the hash won't match, and you can throw the iso away before you waste a cd on burning it.
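The hash-check itself is a one-liner in a Terminal.  A sketch (the file name is just an example; compare the result against the MD5SUMS list published on the Ubuntu download pages):
    $ md5sum ubuntu-12.04-alternate-i386.iso   # prints a long string of letters and digits: it must match the published one exactly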
It is also best to be hooked-up to the internet while doing the upgrade itself.  A hard connection like Ethernet/DSL is probably better for this operation than WiFi.  You download and burn the Alternate Install cd just like you do for the normal i386 cd.  For Ubuntu, it will usually preserve your personal files and data, same as a WINDOWS upgrade.  But you'll have backed-it-up anyway, if you're smart.
Because * ANY * upgrade can go-awry—including with WINDOWS.
BUT BE AWARE THAT DESKTOP LINUX IS * EASIER * TO RE-FORMAT WITH, GENERALLY, THAN OTHER OSes (MAC and WINDOWS)—* IF * YOU TAKE THE TROUBLE TO LEARN THE MINIMUM-NECESSARY THINGS ABOUT RE-LOADING MODERN DESKTOP LINUX (Ubuntu, Mint, Pinguy, or whatever one you're dealing-with), AND TAKE NOTE OF WHAT I HAVE HAD TO SAY UPON THE SUBJECT IN THIS DOCUMENT.  Which is not asking that much of you.  A geeky friend, who understands modern desktop Linux and might be available to come to your house and help you if you get stuck, is also nice.  But remember:  desktop Linux users are scattered all over the world, and so are thin on the ground in any one place.  If you can't find a local LUG to join, you are gonna be much more on-your-own than a WINDOWS user—especially at upgrade-time.  Desktop Linux is still a lot more DIY than WINDOWS or MAC, at the time of this writing (early 2012).  Research it well first, before you try to do an upgrade to a newer version of desktop Linux.
I will add right here that, as I may have already said, a key thing we should all understand where it comes to a re-install of desktop Linux is that Grub has two main parts:  Grub "stage 1", and Grub "stage 2".  The first time you install Ubuntu (or probly most other distros, other than KNOPPIX and perhaps Puppy) in the conventional dual-boot arrangement, it gets stored side-by-side with your ms WINDOWS, so you can boot either of them from your harddrive.  AT THAT POINT, GRUB'S STAGE 1 GETS INSTALLED TO YOUR MBR, over-writing what is there, so that when you hit that power-switch to boot-up, your BIOS bumps-into stage 1 before anything else, and stage 1 executes.  When Grub stage 1 executes, it does just one thing:  it looks around for its "better-half" (Grub stage 2), and then hands-off the boot process to * that *.  Grub stage 2 sits in your Ubuntu partition, not in your MBR.  Stage 2 is the meat-and-potatoes part of Grub.  [Whether we're talking about Grub 2 or Grub legacy, either Grub has these two major parts, and this is what they do.  Yes, there is (sometimes) an intermediate stage, Grub stage 1.5 (often written "Grub stage 1_5"), but that one is optional.]  When you boot into ms WINDOWS, the process is essentially:  switch-on > POST > BIOS > MBR > Grub stage 1 > Grub stage 2 > the Grub menu you see > select ms WINDOWS with the up/down arrow keys.  Grub stage 2 is now acting as the master bootloader, in place of the Windows Vista/7 boot-manager (or XP's NTLDR):  stage 2 looks for ms Windows in the partition you selected, pulls it out of its partition storage, and hands booting over to it; then you see "Starting Microsoft Windows" (or similar) on your screen.  This is properly called Grub "chainloading" Windows, because Grub stage 1 hands off to stage 2, and stage 2 hands off to ms Windows.  For booting Ubuntu, the chain is essentially the same.
SO IF YOU HAVE WIPED UBUNTU FROM ITS PARTITION IN ORDER TO RE-FORMAT/CLEAN-INSTALL IT [BECAUSE YOU PERHAPS THINK YOU HAVE GOTTEN A ROOTKIT OR SIMILAR LINUX-ABLE MALWARE (VERY RARE)], THEN GRUB STAGE 2 WILL ALSO BE GONE, AND THEREFORE NO LONGER IN THE CHAIN.  SO NOT EVEN WINDOWS WILL BE BOOTABLE, EVEN IF YOU HAVE NOT MESSED WITH IT OR ANY PARTITION OTHER THAN THE LINUX ONE.  So the thing to do is to re-install the Linux right away.  Doing so should restore the "chain", just as it built the chain the first time you installed desktop Linux to create a dual-boot arrangement.
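For the record, the "chain" can also sometimes be repaired without a full re-install, by booting the live-cd and putting Grub back by hand.  This is only a rough sketch, assuming (purely for the example) that your Ubuntu partition happens to be /dev/sda5; check with "sudo fdisk -l" first, because yours will very likely be different, and research the procedure for your own distro before trying it:
    $ sudo mount /dev/sda5 /mnt                           # mount the Ubuntu partition from the live-session
    $ sudo grub-install --root-directory=/mnt /dev/sda    # put Grub stage 1 back into the MBR of the first disk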
This kind of jazz (as well as other risks and potential annoyances/PITAs I could mention) is a prime example of why I prefer running desktop Linux from an external USB harddrive.  Another example of why I prefer an external hdd over the traditional dual-boot arrangement is that it seems to get you out of ACPI issues.  At least it let me ditch ACPI issues altogether where it comes to the lappy I use the external drive with [a Toshiba L 515 Sat].  Using the external drive, I found that I did not have to adjust the parameters of the boot process, and the fan and other stuff works normally under Ubuntu 10.04.3's stock, default configuration.
Running desktop Linux from a USB thumb-drive install also offers some advantages.  But the more I do it this way, the more I am persuaded that 1) it is not perfected yet (remember this was written in late 2011-early 2012); and 2) if you actually * use * much of the persistence that's configurable, things start to get s-l-o-w.  A full-hdd type install to a USB thumb does not seem to offer much more, as it still seems to end-up being a buggier, less functional result.  But it is VERY GOOD for testing different distros.  For one thing, desktop Linux will run a lot faster from a thumb-key than from a live-cd—regardless of whether you've done the install to the thumb-key as the usual compressed ".iso-type" install, or whether you went ahead and (carefully) fully-installed desktop Linux to (an adequately sized) USB thumbstick.  Thumb-keys will run desktop Linux at essentially full-speed.  For another, it is booting from a * different * drive than the one which contains ms WINDOWS.  And this seems to abate a lot of issues.
Before re-installing or upgrading desktop Linux, you should download and burn a normal desktop cd of the new release, and play with it extensively in a live-session first, just like you are supposed to do before you install desktop Linux for the first time.  If it seems like you are not gonna be able to fix any hardware-compatibility problems, then you might wanna consider switching to a different distro.  (WINDOWS doesn't let you try-out the new version as a live, "no-obligation" session on your existing computer, by the way.)  Existing (XP) hardware will often NOT properly run VISTA or WINDOWS 7, either.  So most people often just go to a store and buy a newer computer, with WINDOWS 7 already installed.  Even though there is nothing wrong with their present computer.  Many of us don't seriously object to this, because we see it as part of the normal and to-be-expected progression of technology—both operating software * and * physical hardware.  Desktop Linux * can * stretch-out the usable lifespan of a laptop, tower, or other hardware platform.  Often by many years, or even a decade.  But to many of us—even taking into account the global economic downturn—this is rather negligible.  Most people sort of expect a move-up to more powerful hardware.  Or they just don't give it any thought.  Myself, I am chiefly interested in desktop Linux for privacy and security issues.
As I indicated, most people HATE to have to upgrade to a newer operating environment—whether they are WINDOWS people or casual users of desktop Linux.  It is because of this that many of us find the LTS (Long Term Support) versions of many of the popular Linux desktop distros to be attractive.  Ubuntu LTS versions are published in April of every even-numbered year [i.e. 2008, 2010 (Lucid Lynx), and now 2012 (Precise Pangolin)].  And each of these will be supported for at least three (3) years on the desktop, from the day it is first made available online, in its "final form" (Ubuntu has said the 2012 LTS will get a full five).  So Ubuntu 10.04 (Lucid Lynx) will be able to be kept current until April 2013.
MS WINDOWS, by contrast, upgrades “when Microsoft thinks it’s time”—not on a fixed-schedule, like most desktop Linux.  We “enjoyed” XP for about ten (10) years, which frankly is an inordinately long run for any microcomputing OS.  And it was buggy as hell when it first came-out—bad driver-support and all the rest.  Which people seem to have forgotten.
WINDOWS 7 may not enjoy as long of a lifespan.  Earlier versions of WINDOWS mostly did not.
Still, WINDOWS users can still probably just get the young man behind the counter of the big-box store where they’re gonna buy their computer with WINDOWS 8 desktop (or whatever the final name ends-up being), to migrate their data electronically to their new machine using an automated process.  They will trust him (or her), and shop while the process is under-way, returning to the computer-counter in a few hours.  If something goes-awry, and some (or perhaps all) of their data is lost or corrupted, they will probably just return to the store with the machines when somebody else is working, and try again.  Even if they do not get satisfaction, most people don’t seem to blame Microsoft.
Most people I have talked-to about this seem to blame themselves for “not being sufficiently affluent that they would be able to move-up to a MAC”.
If I attempt to interject desktop Linux into the conversation, I find many people make a face, like the kind we all probably make when straining on the toilet.
Or I am met with the verbal response, “I had a boyfriend who used that, once.”  Or, “What is * that * ?”.
But I’ll add this:  IF YOU EVER NEED TO RE-FORMAT * BETWEEN * UPGRADES, IT’LL LIKELY BE EASIER (AND QUICKER) IF YOU’RE A DESKTOP LINUX USER, THAN IF YOU WERE A WINDOWS USER (and you have paid attention to what I have rendered-forth here, as to re-formatting/upgrading, and you also research the issue somewhat yourself).  Why is this?  Because Linux is free, so it doesn’t have to insert things (like some “DRM” code) to help prevent people from pirating it.  And it is often this type of DRM-code that makes WINDOWS a pain to re-format with—say if you get a bad virus, root-kit, or some other bad malware.  This scenario seems less likely these days, as most contemporary malware (at the time of this writing) seems to be just spyware—stuff just intended to track you and phish you, and maybe get your credit-card number, bank number, or SSN/its equivalent if you’re a national of a nation other than the U.S.  Most contemporary malware does not seem to be aimed at deleting your files, or wrecking your WINDOWS system.  (But we still don’t know what will be let-loose on the web next week, do we?)
None of us know what malware may be released onto the web next Tuesday.  We have not seen the like of the Love-bug virus in many years now—partly thanks to the fact that many WINDOWS users seem to have wised-up enough to install some decent anti-virus program.  And probably more to the fact that there is illicit money to be made in getting certain chunks of your personal data—but not in crashing-and-trashing your computer.
This is another reason so many people seem to “wish they could afford a MAC”:  * NOT * because it is really a BSD/Linux-type system underneath; but rather because, while it is * possible * to write malware for MAC, it seems to be just mostly left-alone.  This is also true for desktop Linux, by the way.  Desktop Linux is mostly left unmolested.  There are many competing theories as to why this seems to be the case.  Or you can make-up your own, if you like.
If desktop Linux gets seriously broken, it is often better to just re-format/reinstall, as this is usually much easier than re-formatting ms WINDOWS.  Usually even more so, if you have made a separate Home partition for yourself.  And provided, as I said, you know what to do.  Just be sure to backup all your personal files, and your old Linux kernels [if you have any in the system].  You don't know what you wish you had backed-up, until it's too late, and you've over-formatted it.
71. Entry 71:  IT KEEPS CRASHING?    If your Linux install (to either a hdd or thumb-key) is crashy, something is probably seriously off-base.  Linux, properly installed, is very stable—assuming it is copacetic with your hardware.  Individual applications may be "crashy", and need to be adjusted/updated/et al; but the Linux os itself should not be "crashy".  I have come to the conclusion, at long last, that Linux just will not run on some machines.  Retrace your steps.  If you can't solve it in a reasonable time, try another distro.  That's why there are many distros.
72. Entry 72:  WHAT IS “BORK MY INSTALL”?        “Bork” is just slang for break, as to cause to no longer function.  If you make a serious mistake in Linux, and cause your Linux install to no longer work (rare), you can just re-install it from your live cd, or upgrade to the next release.  Linux is not that hard to re-install, if you can just pay attention to what you’re doing.
73. Entry 73:  LINUX HAS COME WITH NTFS SUPPORT FOR SOME TIME NOW:  Time was, Linux could only read files from the WINDOWS universe that were on the FAT/v-FAT format; however, Linux has had NTFS support for years now, so it has no trouble reading files that happen to be sitting on an NTFS-formatted system from Windows NT, 2000, XP, or Vista/7 (and it can still read the older FAT formats that Windows 98, 98SE, and ME used).  Linux can write to NTFS, too.  And WINDOWS can read most UNIX/Linux file-systems as well, though the installation of an additional (third-party) software is needed.
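Normally the file-manager just mounts a WINDOWS partition for you when you click on it, but for the curious, the Terminal version looks roughly like this (the partition name is only an example; find yours with "sudo fdisk -l" first):
    $ sudo mkdir /mnt/windows
    $ sudo mount -t ntfs-3g /dev/sda1 /mnt/windows   # ntfs-3g is the driver that lets Linux read *and* write NTFS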
74. Entry 74:  WHAT THE HECK IS “HOME DIRECTORY”?
This is just the same as your "personal" folder in Windows, underneath which is "My Documents", "Pictures", "Videos", "Music", "Favorites", and probly also some folders you have created and named yourself.  If you click on Start in Windows, you will notice that toward the upper-right of the Start Menu, there is an icon with the name of your user account.  You assigned yourself some user-name, sometime ago when you created an account for yourself on the system, or when a friend imported your old settings from a previous computer you had.  You are allowed to set the account to either Administrator or limited-user, on a Windows system.  You can create any number of other accounts on Windows, for yourself, or for family members who also want space on the computer.  These can be limited-user accounts [without Administrator powers, so they have to ask the Administrator (your dad) before installing some software], or else some of 'em can be Administrators too (mom * and * dad), so there can be more than one "boss" of the system.
In Linux, by contrast, you can be either Administrator (“Root”), or you can be limited-user:  “Fido”-builds of Puppy Linux let the “boss” have the Administrator/Root account, and everybody else is intended to have their own space (if you desire to set-up Wary Puppy or Racy Puppy this way, for example); but the way Linux-systems are structured, the general intention is that there be only one “Ultimate Boss” of the system.  So if you set-up a Fido-user account for your daughter on Wary Puppy 5.x, she will not be able to install any new programs to the system without your permission, and this would be in the form of somebody typing-in the Root/Administrator password, which of course was set by the boss when Wary was being used for the first time.  This password can be changed later-on, and any number of times.  Be careful, though:  if you lose it/forget it, it’ll be somewhat of a hassle to reset it.
Of course your daughter can figure-out how to do this, too (reset it), and without having to know the existing Root password/First User password (if she is like, over the age of 11, or is a computer-whiz.  And of course she has physical, hands-on access to the machine, which is a huge advantage over, say, a black-hat hacker out there in cyber-space.)  But like I said, it still involves some hassle, and also you’d probably know the Root password had been messed-with, the next time you got on the system (at least if you were logging-on as Root).  And there are ways to set a Linux box so that such tampering becomes very, very difficult, which I shall come to.
There are other ways to tell, too, like checking certain log-files which both Linux systems and Windows systems save by default.  I am told that there are ways to limit access to these particular folders—even from the root, so that they are protected against tampering with yet another hard password, which only the root-administrator has.  I have yet to try this for myself, however.
If other people have Administrator-level accounts on your WINDOWS system, however, they have the same powers as YOU, and it is somewhat more difficult to alter this paradigm, without just downgrading your daughter to limited user.  Linux, on the other hand, is set-up with more granular, graduated account control by default.  [Or maybe it is not:  you could try going to the link http://www.zdnet.com/blog/perlow/browser-protection-the-next-generation/12790 , and read the first post in the comments-section by “honeymonster”.  Really, both WINDOWS 7 * and * modern desktop Linux are both very capable systems; and we can kick all of these arguments back-and-forth (and people have).  But I yet maintain that one can configure his-or-her desktop operating system to be * MUCH * more secure, by learning Linux.  I guess we’ll see.  I will say this, if I haven’t already:  WINDOWS is so “historic”—in having been around for so * long * as a desktop, and has been and * is * so ubiquitous, that there are just hacker-wares readily available, which anyone can download, and some of which do not even require computer competence at a high level.]     WINDOWS User Account Control feature may be easier to use (?), but allows less fine-tuning—at least the way it comes, by default.  I guess you could just bone-up on WINDOWS Permissions-settings, which in Windows * SEVEN * is easier anyway, than it was in previous versions of ms Windows.  But I still tend to think Linux-systems are more secure, as they come by default.
My recommendation is that you operate Linux from a User Account, unless maybe you are using Puppy, and it is running as it is intended—as a live-only file-system (this means "other than a full, traditional-type harddrive install":  it basically amounts, then, to it being run as either a live cd/DVD, or as a "Frugal" install to a harddrive).  Even if you are using a distro that will allow you to run as True Root at all (like KNOPPIX or Fedora), DON'T DO IT.  Unless you are already an advanced user, and fully understand what you're doing.
Use something like the "sudo" utility, or, more to the point, just be content with typing your password into a dialog-box when installing a new software, or doing something else that requires Administrator permission.  Root/Admin account use in Linux is only necessary for situations like, say, somebody that is running a Linux * SERVER * who might be trying to repair some serious problem he (or she) is having—and then only as a last resort.  There are plenty of other ways to fix a problem in * desktop * Linux if you develop one, besides trying to log-on as Root/Admin.  And all configuration of the system can be accomplished without accessing the actual Root account itself (at least on most Debian-based distros)—for instance by using the "sudo" command (which comes with Ubuntu and most of its close relatives, by default)—or, more-to-the-point, with graphical dialogs like GNOME configuration editor, or the "Preferences" menu.  Try never to log-on to any Linux-based system as actual Root:  use a graphical tool from a non-root account, or something like SUDO—unless you're a truly expert user or a developer.
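In practice, being "content with sudo" just looks like this in a Terminal (the program being installed is only an example):
    $ apt-get install vlc        # refused: a plain user is not allowed to install software
    $ sudo apt-get install vlc   # allowed, after you type your own (First User) password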
Busting a bad move as True Root can break (“bork”) your system, and there ain’t no System Restore or restore-points like in WINDOWS.  [Actually, maybe there is:  check-out BackInTime-for-Linux, and MondoRescue.]  If your desktop Linux gets very messed-up, you will probly have to re-install it.  This procedure is a lot easier and quicker than with WINDOWS, but it’ll wipe-out personal data, so you’d need to remember to back that up first.  This is pretty easy to do, though.  You can off-load/copy pretty much any personal files from a Linux partition, just by using the files-manager.  Even if you are using it from a live environment, like a cd or a USB thumb.  You could just use the cd you installed from as a rescue cd—just run it as a live-session like you did when you installed, and use Nautilus or other file-manager to drag files from your harddrive to another location—like a USB thumb-key, or a cd-creator like Brasero (included in Ubuntu by default).  You’d have to have a pc with more than one cd-tray for this last way of backing-up personal files.  But not for the thumb-key method.
I will add that the Puppies-Linux are rather exceptional, in the way their Linux file-system is laid-out—though this is accomplished in such a way that it does not make it any harder for you, the newbie-user.  In fact, the lay-out is exceptionally easy.  It’s just done somewhat differently than other Linux, to enable the Puppies to run as a whole live-file-system, and completely from the RAM-disk, instead of just invoking the files needed at any given moment—which is the way most other os-es run, at bottom—whether they are Windows or Linux or what.  This modus gives Puppy Linux certain advantages, more about which in other documents.
When it comes to most other Linux distros, you are set to run from the limited-user by default, and you install software or other Root/Admin stuff by knowing the "First User account" password that you set when you first installed the distro.  This usually means you are graphically prompted, just like in Windows (assuming you did not dis-able User Account Control in ms Windows).  Or you can use the "sudo" or "su" prefix if you are gonna do some such task in a Terminal/command-line, which will then cause the system to prompt you for a password (with "sudo" on Ubuntu-type distros that is your own First-User password; with "su" it is the actual Root password).  In some distros (like I think MEPIS), this "sudo" feature is not already enabled, by default, so you may have to set it up.  MEPIS is a pretty easy distro, though, generally speaking.
In many "RPM-based" systems (like Fedora, for example), you set-up a Root account, and then a limited-user account, the first time you run the system, and each with its own password, which you dream-up yourself, and which can be changed later-on.  But I would write all this needed setting stuff down on a pad beforehand—the hard passwords I intended to use, the nickname I intend to set as my Linux's hostname (nickname of the Linux area of the computer itself), and the user account nickname(s) (mine is "flyingfisherman":  notice no spaces between words—it needs to be like a "string" for this thing).  Make these decisions ahead of time.
But in most Debian/Ubuntu-type Linux distros, you just do everything from a limited-user, and you may not even be given the real Root/Admin password—because the way these distros are built, you won’t need it—it’ll let you just use the password of the first user that gets created by the system the first time it boots, and this person (and his/her password) become the “boss” from then on, no matter how many other user accounts are added after.  This makes the system more secure, arguably, because any virus or malware that might be out-there (or be developed by somebody next Tuesday) will have a harder time infecting the real Root account.  Because in these type distros (Ubuntu, microKNOPPIX, Linux Mint, other Ubuntu variants, other distros—check the distro’s documentation), the Real Root account is hidden, and it is only used by the system itself, just for various operations in the “background”, so-to-speak.  The “boss” password (password for the First User account) can, of course, be changed later-on, if the boss decides he (or she) wants to do so.  Accounts can be re-set, too.  And there can be more than one Administrator on the Linux system, if you really want.  So (in Ubuntu, at least), both you and your wife could be “Administrators”.  If you really want to.
I say “arguably”, because some people do argue with this way of laying-out a distro (Ubuntu and its variants, for example), and they assert that this is really no better than a more “traditional” accounts-management schema.  Well, I think it’s rather a judgment-call.  Your security in any system—even Linux—is only about as good as your hard Root/Admin/Primary-Ubuntu-User- Password, anyway.  If you just can’t stop yourself from using “Biff” (your dog’s name) for the first password you set in Ubuntu, then you ARE NOT gonna have much greater security advantages than you would in WINDOWS.  (You will, however, still be * somewhat * more secure, just because Linux is more secure by default.)
If, on the other hand, you learn to create a truly hard-password (Google this), and are willing to actually type-in that sucker every time you want to do some serious operation to a properly set-up Linux system, then it * WILL * be more secure than WINDOWS, even without further adjustments or customization.
When you first boot Fedora, unlike Ubuntu and some other distros, Fedora has you create a Root account and a user account, and assign each a password that you make-up, out of your imagination.  So maybe you should have a margarita first.  Or I guess you could try using a random hexadecimal number generator, of which there are free services on-line.  There is even a service which creates a strong password for you to use (or not use, if you decide you don't like it), and which patterns these on combinations of words, numbers, and phrases, which actually make intelligible sense, and so are easy to remember—though they are peppered with stuff like the "@" symbol, and other special characters from your computer's keyboard—which makes the password hard to "crack", for a black-hat hacker.  [Look for pwgen (which is packaged by Debian).]   NOTE that Ubuntu comes with a "password generator", already installed in the system, and ready to use.  I have never bothered with it, however:  I'm used to just making-up stuff I can usually remember, and keeping the rest in a small notebook, in which I have written the passwords in a code of my own making.  (And I like the taste of lime, too.)  There is also a program called Diceware.  I don't know much about it, but it has a Wikipedia article giving general info about it.
If somebody found the paper notebook, it would be easy to crack my simple cypher-code.  But why would they?  I have written no clue in there, as to what kind of information it is, and my cypher is in badly written cursive, with a ball-point pen.  Anyone would have difficulty in entering the writing into a cracker program, by typing it into a keyboard, and OCR is obviously out (at least at the time of this writing, anyway).  Nor do I have my name, or my real date-of-birth written in there, nor much else that would easily lead somebody to my computer’s hostname and ip-address.  Without this critical info, (and other critical data I did not mention here, but you can figure it out if you think a little about it), somebody would have a much harder time black-hatting my computer itself.  What they’d have readily would be my  passwords and user-names for online sites like Hulu and YouTube.  But just like somebody whose wallet was stolen and has to re-set their debit-cards and re-claim their traveler’s checks, I can just start-over with Hulu and YouTube.  I don’t store data in them anyway.  Any really critical data I have is stored off-line.  For example, I have never entered my bank-account number into any of my computers.  I don’t do online banking at all, and if I have to pay for something online, I use a temporary visa card purchased at a (reputable) store-front check-cashing/payday-loan business.  [These type of cards can also be obtained from the service-counter at many discount and big-box stores.  If you research them first, and know what you’re getting, they are remarkably safe.]
Just like somebody who loses their wallet, it’s not your stoopid customer-appreciation/frequent-buyer membership card from your local fresh-fruit vendor, or a picture of your dog sleeping on the laundry-basket that you’re worried about:  those can be replaced/replicated, with a little hassle.  What you’re worried about somebody getting is your SSN, Visa card number, personal photos, work-related documents, &tc.  With Visa you have little choice; but the rest you try not to carry-around in your wallet.  If you’re smart.  Of course, I realize some of us have * no choice * but to enter some of these data-types.  Because of the nature of one’s business, or some other reason.  More about all this security-oriented stuff in other documents.
Google:  “how to create a strong password”.
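If you went the pwgen route I mentioned a few paragraphs back, generating a few candidate passwords is a one-liner.  Here is a rough sketch, assuming pwgen has been installed from your distro’s repos (on a Debian/Ubuntu-type system, “sudo apt-get install pwgen” would fetch it):
pwgen -s 16 5      # prints five random, 16-character passwords ("-s" means fully-random/"secure")
pwgen -sy 16 5     # the same, but "-y" sprinkles-in special characters like "@" and "%"
Pick one you can live with, write it into your little coded notebook, and throw the rest away.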
Your real security starts with strong passwords for:  1) your “First User”/“Primary”-user in Ubuntu—the first account created on the system at the time of install or persistence-configuration/casper-rw:  more about this later; and 2) any additional accounts you set-up later.  Most Ubuntu users probably don’t bother, but consider this:  create an additional user account soon after you begin using Ubuntu, give * that * a strong password, and use it as your main, everyday account.  You will then have some added security, because the strong password of the * Primary * account only has to be typed when you install a software (or do something else that requires a super-user level of privilege).  Because it is not the passphrase you use to log-on every day, it gets typed much LESS frequently, and that makes it harder for some black-hat’s software keylogger-bot program to “sniff” it.  And so they will have a significantly harder time doing anything of consequence to your system.  We have to learn to defend ourselves from the “script-kiddies” first, if we’re at all interested in bigger security concerns.  Just be sure that this additional user-account is not “sudo-enabled”:  only the First User account’s password should work with sudo.  (A sketch of setting this up is just below.)
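Here is a minimal sketch of how that looks in an Ubuntu-type Terminal (the account name “daisy” is just a made-up example):
sudo adduser daisy     # create the new, everyday account; it will ask you to set daisy's password
groups daisy           # list daisy's groups -- make sure "sudo" (or "admin", on older Ubuntu) is NOT among them
You can also do all of this from the graphical Users/User Accounts tool in the menus, if the Terminal gives you the willies.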
Ubuntu’s default configuration of a “Primary” limited-user-with-the-sudo-utility-for-Linux pre-installed is supposed to effectively thwart exactly such attacks.  But my feeling is it is easily circumvented, with to-day’s level of speed, and other improvements to exploitation/pen-testing softwares.  Setting-up Ubuntu (or one of its many variants) as I just described sounds like a lot more trouble; but it is not.  It is no more trouble at all than doing the recommended setup of WINDOWS 7—making an “Admin” (in Linux we say “root”), and then an ordinary user-account to operate the system-from, almost never needing to switch into Admin (well, theoretically anyway).  And running with UAC (User Account Control) turned-on all the time.  The trouble with this, of course, is that—even in the WINDOWS 7-era—WINDOWS just will not do certain things from its user accounts, and so many people run their account as Admin.  Which is less secure.
Modern desktop Linux, on the other hand, will do * everything * from a limited-user account.  Everything it is capable of doing at all.  Period.  So you don’t have to run as “root”.  You don’t even have to run as “Primary” user.  You just have to refrain from forgetting your passwords—especially for your Primary (and, if using a system like, say, Fedora, “root”).  Even if you mislay them, as long as you actually have physical, real-world access to the machine, there are ways to recover them.  If you want to use Linux to run a headless server, or are away on vacation and cannot physically access the machine, things will be tougher if you forget passwords, though.  More on all  this later.
At least one thing further:  when setting-up an Ubuntu system (or somethin’ based-on Ubuntu), if you then create additional user-accounts (beyond the Primary account), these will have a few extra restrictions on what is allowed to be done from them, and they will be somewhat different restrictions from what a limited-user has by default on a WINDOWS 7 system.  BUT THESE DEFAULT SETTINGS CAN BE CHANGED, especially from the Primary (first-user) account.
So Linux, generally, is more “Layer-on-Layer”, yet still more OPEN to customization.  While WINDOWS, on the other hand, is more like a bar-room fist-fight:  everybody usually starts out as equal (at least theoretically).  Especially with everybody on the system able to login as an ADMIN, and UAC dis-abled—which I notice is how a lot of families set-up WINDOWS.  And then, as the dust settles, one of the Fighters emerges as “boss”.  Which could mean You, your college-age son, or the black-hat hacker who just made your harddrive part of his distributed “bot-network”.
Yes, you * can * set-up Ubuntu so that more than one account can be Administrator, just like on WINDOWS.  But unless your wife needed to be Administrator too, and you did not wish to share the First Account password with her, why would you want to?
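(If you ever did decide to make a second account an Administrator, here is roughly what that looks like from the First User’s Terminal; “daisy” is again just an example name, and note that on older Ubuntu releases the magic group was called “admin” rather than “sudo”:
sudo usermod -aG sudo daisy     # "-aG" appends daisy to the sudo group, without touching her other groups
There is also a check-box for the same thing in the graphical Users/User Accounts settings.)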
MS WINDOWS has come a long way, but its family tree goes back to MS-DOS, which was closely modeled on the old CP/M master control program—and CP/M and DOS amounted to hobbyist-era, single-user programs, from the world of the people who attended the meetings of the Homebrew Computer Club.  I do not mean to deride the HBCC in any way.  Nor to deride even WINDOWS.  Steve Jobs and Steve “Woz” Wozniak, after all, came out of that club, and Bill Gates famously addressed its members in his “Open Letter to Hobbyists”.  The point I would make here, is that WINDOWS grew out of a feeble single-user program for feeble personal computers that people used to order as a kit of parts, through the mail.  That doesn’t make WINDOWS bad.  (And yes, the modern NT line of WINDOWS—XP, Vista, 7—was largely re-written; but a lot of the single-user assumptions came along for the ride.)  It just means (in this context) that WINDOWS may still lack some of the inherent advantages of an operating-system built in the world of * multi-user *, mainframe-class computing.  And Linux, as I may have said, is based-on UNIX, an OS from that multi-user world.  This difference probably becomes less important with the passage of time.  But it may yet be that not enough time has passed for it to have no bearing.
So as regards system security, WINDOWS is playing catch-up ball.  Has it finally caught-up, with its latest iteration in WIN 7?  People in-the-know have differing opinions.  I tend to doubt that WINDOWS has caught-up to Linux, but as I say, it’s rather a judgement-call, and the issue has been argued back-and-forth.  And * I * am not a programmer.  (Yet.)  But I tend to feel that, because Linux offers so many permutations, and alternate browsers and desktop-environments/window-managers, this “makes the gene-pool a lot more diffuse”, and so it is just harder to write an effective malware for Linux Desktop.  It’s also true that (to repeat myself—and I hope this is the last time—because all this computer-stuff is boring enough to begin with) the desktop Linux communities have a set of preferred methods in place for distributing softwares which is more rigorously proctored than the way most of us try to “securely download” stuff in WINDOWS.  And Linux distros have a track-record of updating the os and most of its installed software * much * more quickly than the MS corporation.  And further, desktop Linux and its distributed community—occasionally flawed though it may be—is a lot more dynamic than the 400,000 ton aircraft-carrier that is Microsoft.  So security-hole exploits that occur in Linux tend to get reported more quickly, and especially they tend to get fixed/patched a lot more quickly.  I put a lot of this down to the fact that a great many banks and big institutions run Linux—especially on their servers and mainframe-class machines; and, by most estimates, somewhere around half (or more) of the server-computers in the world are Linux-boxes.  And you don’t want your server hacked, so there is a good deal of incentive for Linux server admins to read forums regularly, and to communicate and co-operate.  This is also true of admins of Windows servers, but as I’ve said, the server market is one in which Windows has a * minority * share, and therefore may have fewer total hours of connectivity with other hosts than Linux or UNIX-family servers.  Apple’s OS X, let’s not forget, is really a UNIX/BSD-type system, underneath.
Remember, too, that there * is * no way to stop a determined hacker.  But it is * possible * to set-up a Linux machine to make it a lot more difficult for intruders to trespass into.  For one thing, we can experiment with running Linux as a compressed-filesystem—either from a live-cd, or as a frugal-install/”poor-man’s-install”/PMI.  Which is a capability that is much more difficult with Windows.  (Or so it appears at the time I write this.  Remember:  one of the most prominent features of computing is that it’s always changing).  My findings in regard to running desktop Linux as a compressed file-system to date are, stated concisely, as follows:
The Casper-RW utility for saving changes works, but is probly best kept as small as possible.  (The Puppies Linux, as I’ve indicated, are yet another kettle-of-fish.  More about them elsewhere.)  Why does persistent-saving (which usually means Casper) seem to work best if it is little, and all or nearly all of the programs/softwares/packages you intend to use are already in the iso image, at the time you do the frugal install?  I don’t know.  But it seems to just * be * this way.  Doesn’t seem to matter, though; you can use RemasterSys to create your own .iso of desktop Linux, then run that as your compressed/.iso-type filesystem, and just use the persistence feature for the few preferences and settings you might have forgotten to add.  Puppy comes with its own re-mastering program, which seems to work excellently.

Myself, I prefer the frugal-install type, as opposed to running from a DVD.  It’s faster, the mechanics involved seem more reliable, and it seems just as secure.  And it doesn’t tie-up your cd-tray—a potential factor for laptop users with only one tray, who aren’t satisfied with Puppy.  Of course there are add-on USB-powered trays on the market.  But these can have compatibility problems which a built-in cd-drive that is part of the manufacturer’s default hardware config is less likely to experience.  Again, there are the Puppies, and other (small) distros that run wholly from the RAM, and so the boot-cd can be removed from the tray after boot fully finishes.  And of course there are nowadays ways to easily make bootable USB thumb-keys yourself.  But my preferred method of running desktop Linux as compressed is still as a frugal install on my harddrive.  Custom-iso with RemasterSys, and persistence just for things I forgot, or some program I later find I want.  And once this configuration is set-up and in use, it will be rather hard to crack/blackhat-hack.  There are still ways—but if you’re on-the-ball as system admin (or even just a good User), then it will be very difficult for somebody to hack you running this “frugal install” of Linux—at least as of the time I write this.  And I should probably not neglect mention of the “VM” way of running Linux:  I probably already mentioned this, but it is possible to install a small program to your Windows system, which creates a standardized “container” from which you can run various Linux images published just for this purpose, or other types of operating system as a “Virtual Machine image”.  Remember:  it’s always changing.  So by the time I’ve been able to post this, there may be some new development.
I’ll make the caveat here, before somebody else does it for me, that a fairly tech-savvy teen with too much disposable time AND PHYSICAL ACCESS to the machine, can defeat the Linux Primary account’s hard password, by running a rescueware from the cd-tray.  These rescuewares are free to download and burn to a cd, from pretty much any computer—Linux or WINDOWS.  There is no absolute fix for this, but you can make it VERY hard for the unauthorized person who wants to try it, and this probly by just setting a (different) hard password for GRUB (or whatever bootloader you have), and a different (hard) one yet for the machine’s BIOS.  (Or just run Linux from your own thumb-key—though then you won’t be able to inspect the contents of your son’s Linux, if HE’s using HIS OWN thumb.)  But you will not likely need these other two passwords again until it’s time to Upgrade, which, if you stick to the Ubuntu LTS builds like me, only comes around every few years.  (So be sure not to lose them.)  Only the usual account passwords would be required for any authorized user(s) to log-on:  the BIOS and GRUB passwords are generally only asked-for if somebody is trying to change the GRUB or BIOS settings.  These two passwords protect against a hardware-hack, where somebody uses a live-cd rescue tool to watch unauthorized content on the computer when you’re not at home, and then erases their tracks when they’re done—your teenagers (or your butler) using some rescue-disk they downloaded at the public library to hack the machine, and get-into stuff they’re not supposed-to.  And of course these type of hardware attacks apply just as well to ms Windows—or moreso.
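For the curious, here is a rough sketch of how a GRUB 2 password gets set on an Ubuntu-type system.  I’m hedging here, because GRUB’s config layout varies a bit from distro to distro, so check your own distro’s documentation before trying it (the “admin” name in these lines is arbitrary):
grub-mkpasswd-pbkdf2              # asks for the new password, and spits out a long "hashed" version of it
sudo nano /etc/grub.d/40_custom   # add two lines:  set superusers="admin"  and  password_pbkdf2 admin <that long hash>
sudo update-grub                  # re-generate the GRUB menu, so the change takes effect
One caveat:  depending on your GRUB version, doing this naively can make GRUB demand the password even for a perfectly normal boot, unless the menu-entries are marked “--unrestricted”.  So read-up first, and don’t lock yourself (or your mum) out.  The BIOS password, by contrast, is set from the BIOS setup screens themselves (usually by tapping Del or F2 at power-on), and those menus differ by manufacturer.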
So Home/home directory is just some computing space for you to use, inside the mainframe computer which is almost as big as the basement of the office building.  Congratulations !!  Because of advances in miniaturization and software, you now have a (Linux) mainframe on your desk!  This Home-directory (Home-folder) is allocated storage space for you to use, and no other user on the system can get into it, unless they know your password.  So it’s like a private room, just for you, with a combination lock.  This is so whether your account is set with Administrator privileges or is a limited-user which the person in-charge of the system created for you.  Don’t fret about what the size of this room is or “should be”.  Linux allocates the space automatically.  (But it does not “mix-up” the space of user-accounts.  No system does.)  If you set up Linux on the computer, and you are the one who set the passwords, then you are the “primary user”, which effectively makes you the system “owner” (the administrator).  Other user accounts can be created later-on, if you wish.  You could create one for your girlfriend.  Most modern Linux let this be easily done, from menus.  It will ask for the password(s) you set when you set-up the system, though.
If you were able to set-up your Linux with a separate Home partition (separate from the root partition), a re-install will be that much easier.  But this may be deeper water than a noob would want to wade-into, in his or her first dip in the pool.
As an afterthought, I will tack-on here that AppArmor, and more-to-the-point SELinux (Security-Enhanced Linux) are things worth checking-into as regards system-security—but know that you will be wading out of the “shallow end of the pool”.  It will take a little time to become acquainted with these (especially SELinux); however there is plenty of documentation and discussion available online.
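If you just want to peek at whether one of these is already running on your system (it often is, quietly, in the background), there are a couple of harmless one-liners.  A rough sketch; which one applies depends on your distro family:
sudo apparmor_status     # Ubuntu/Debian-type distros:  lists the AppArmor profiles that are loaded and enforced
sestatus                 # Fedora/Red Hat-type distros:  reports whether SELinux is enabled and "enforcing"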
A good notion to keep in mind, if you’re privacy/security-oriented (like me), is also one of the more obvious ones:  ** Don’t ** set ** your ** hostname ** and ** other ** settings ** to ** your ** real ** name ** and ** street ** address **.  Use only aliases, or “nick-names”, if you can (some of us are stuck with use of our real name, because of situations with where we work, and the like).  This is perfectly legal in the United States, at the time of this writing.  It is also wide-spread practice, and does not in-and-of-itself make you a bad “netizen”.  Of course, you can do it in Windows, too.  But because Windows costs money to get (often a noticeable chunk of the purchase price of a new computer or similar device), and because it is difficult to re-install if it gets messed-up, people are more reluctant to try to form some meaningful business relationship with Microsoft (or one of its designated third-party vendors, such as Digital River, for example) while calling themselves only “Biffy_101”, or the like.
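Changing the hostname to a harmless alias is pretty painless.  On an Ubuntu-type system of this era it amounts to editing two little text files; here is a sketch (“biffy101” is only an example alias, and your distro may well offer a friendlier, graphical way):
sudo nano /etc/hostname     # replace the one name in this file with your alias, e.g. biffy101
sudo nano /etc/hosts        # change the old name on the "127.0.1.1" line to the same alias
sudo hostname biffy101      # apply it right away (or just reboot)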
75. Entry 75:  OKAY, BUDDY, SO SUPPOSE I WANT TO HAVE GOOD LINUX SYSTEM SECURITY, BUT I DON’T WANT ME OR MY GIRLFRIEND TO HAVE TO TYPE-IN THAT TEDIOUS, HARD PASSWORD, JUST TO LOG-ON TO PLAY A GAME, AND LIVE-CHAT WITH AUNT LINDA ??
You’re in luck.  There are at least a few ways to set this up on a Linux system.  I’ll describe at least one of ’em for ya, here.
As I may have said in the foregoing text, I have set-up my semi-elderly mother with a Linux install, to my old HP Pavilion slimline tower (AMD, 32-bit), because she had so much trouble understanding how to safely run in WINDOWS.  Really, if we had kept-at-it, we might have got her going from a limited-user account, which could mean fewer pop-ups and “enticement-ads” that divert you to another webpage—sometimes only if you position the pointer over them.
And I’d probably have had to put NoScript on her browser, which might have been a hindrance to viewing content on certain sites she likes to visit, and this add-on can be hard to toggle on-and-off.  Especially for an older woman who never even had an * electric * typewriter * in her life.  And I noticed it took mom a comparatively long time to learn the controls of her new microwave.
And so, having recently started in using Linux (Ubuntu) myself, one day when I had some time, I booted the distro I was then using (Karmic Koala) on the machine, and messed-around with it live for a little while, just to try to test that it would really work on her hardware.  Stuff seemed to work fine (FireFox, games, OO documents, tabbed-browsing, folder navigation, &tc.)—so I closed all open programs, and DOUBLE-clicked the Install icon on the desktop interface.
It took like about an hour and a half, and I had to stay there, to answer some questions it would occasionally put to me.
When I was sure it was good-and-done, I took the option to shutdown (which Ubuntu presents), and, after leaving it off for a few minutes (just to make sure everything would be properly applied), I booted-back up, and it automatically landed me at the blonde dunes of Karmic’s desktop.  And it’s been running ever since.  Mom hasn’t had to call me for tech support in many months.  Of course, all she does is surf the web, or perhaps occasionally watch a documentary I downloaded for her and saved as a playlist.  But these limited things are about all she is interested in—for now, anyway.
The biggest stumbling block was logging-on with her password, just as it was a stumbling-block for her in WINDOWS.
The fix was for me to set Ubuntu’s log-in so that it does not prompt you on boot-up.  [NOTE that this way of running is not officially sanctioned by either Canonical, Ltd./Ubuntu or most of the Community-at-large.  But if you’re just using the system for limited purposes (like my mum), and nobody visits your apartment who is likely to fire-up your computer when you’re not looking, then using desktop Linux this way is probly still more secure (much more) than * any * configuration of WINDOWS].  It will still prompt me for the password, though, if I am over there to install some software for her, or just to see how Ubuntu is, or to boot the WINDOWS partition to update it, which I do about every six months.
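For anyone who wants to set this sort of “automatic log-in” up themselves:  in the GNOME 2-era Ubuntu I was using, there was a graphical way (System > Administration > Login Screen, then “Log in as … automatically”).  There is also a config-file way, which depends on which login manager your distro uses; so treat the following as a hedged sketch for GDM only (“mom” being just the example account name):
sudo nano /etc/gdm/custom.conf     # then add, in that file:
[daemon]
AutomaticLoginEnable=true
AutomaticLogin=mom
Distros that use LightDM instead (many newer Ubuntu flavors) keep a similar setting in /etc/lightdm/lightdm.conf (the “autologin-user=” line).  Check your own distro’s documentation before fiddling.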
This way, the system still has some of its password security, in that a malware might still be stopped by the system’s password prompt, if it tried to install itself from a webpage.  Most such malware is coded for WINDOWS machines, anyway, and so doesn’t seem to affect a Linux system.  I might try installing a sandboxer-program to her browser too, when my time permits.  She does not download anything anyway, and has made it clear that she is not interested in learning how.
Yes, you can download and install programs in “generic” Linux code (“compile from source”, & related extra-curricular activities).  But, except for a  few of the “big, well-known” apps—stuff like, say, Real Player, FireFox, or Google Toolbar—you will find that things generally do not install themselves in Linux, nor do they have those “installer-wizards” you’re used-to in WINDOWS.  So code just does not install (usually) on a Linux system the way it is able to on a WINDOWS system.  Instead, it needs more help from YOU, UNLESS you are installing it from your distro’s repositories (“Repos”), which amounts to special servers reserved for this purpose.  And the 15,000 or more programs in there are monitored to some extent by the creators of the distro, and, perhaps more-to-the-point, * by the user-community at-large *.
No, I am not saying it is impossible to malware or black-hat hack Linux.  There isn’t much of a way, under the sun, to stop a determined hacker—short of maybe not having internet access at all.  I am only saying that Linux is * more * secure, generally, than WINDOWS.  Some of this is because Linux has a sufficiently different underlying structure, and some of it is for other reasons—like the user-community, which I find to be more tightly-knit in Linux than in the WINDOWS-metaverse.  Some might even say the Linux community is “Too tightly-knit”—WHERE IT COMES TO THE CODE-LEVEL.  If you ever progress to the point where you are designing your own program, you may experience some of what I mean.
But as far as the rest of us are concerned—ordinary * Desktop * users *—Linux is easy enough to operate, once you “get-it-down”:  and the user-community is friendly and helpful, generally speaking.
76. Entry 76:  BACKUP ON LINUX:      For backing-up just your user-data in Linux, I tend to think in terms of free programs like FreeFileSync, Deja Dup (pronounced “Day-zha Dupe”), Grsync, Meld for Linux, or Lucky Backup.  A good article as to Deja Dup may still be available at http://www.linuxtoday.com/infrastructure/2011112200539OPSWUB .  Myself, I am currently leaning toward FreeFileSync.  This seems like the best one.  If I find any significant problems with it, I’ll try to post an update here.  Mind you, as with the  rest of free Linux, it “comes with no warranty”.
There are many, many other backup programs for Linux systems, and most of ‘em are free.  YOUR DISTRO MAY ALREADY INCLUDE THIS FUNCTIONALITY, SO LEARN A LITTLE ABOUT IT FIRST.
Ubuntu already comes with “SimpleBackup”/S-Backup.  I think another good one to look at would be BackInTime.  There are a whole slew of backup programs for Linux.  But I think BackInTime lets you restore to an earlier configuration—much like WINDOWS restore-points.
If you want to make a backup of an install of Linux, you can create a ghost-image with P.I.N.G. , which you can download for free; or, you can use any one of several utilities/discs that you can d/l free, or that you can buy.  RemasterSys is also a good way.  I guess Parted Magic (not to be confused with Partition Magic, which is a commercial disc) would be another one.  Other ppl will post here, which is the nice thing  about a blog.
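Worth knowing:  Grsync is really just a friendly graphical face over the venerable command-line tool rsync, which ships with practically every distro.  So even without installing anything, a single (hedged) one-liner can mirror your stuff to an external drive (the folder paths here are made-up examples):
rsync -av --delete /home/yourname/Documents/ /media/backupdrive/Documents/
# "-a" keeps permissions and dates, "-v" shows what it's doing, "--delete" makes the copy an exact mirror
Do be careful with --delete, though:  it removes files from the backup that you have removed from the original, which is exactly what you want for a mirror, and exactly what you don’t want if you deleted something by mistake.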
77. Entry 77:  HUNG WINDOW?  FROZEN APP?        In WINDOWS, you’d use the Task-Manager.  I’ve found that, even in W7, this sometimes doesn’t work. And it can take up to 5 minutes to close the problem program.
In Ubuntu, at least, there are different ways to deal with this situation, should it arise.
One way is to open a Terminal window, and type “ps -A”.  This will list all the processes running, with the name of each program, and a 4-5 digit “code” number (the PID) next to it.  Then type “kill -9” (without the quotes), a space, and that “code number”, and hit Enter.  This will kill the app, no questions asked, and with no caveats given.  That program should now be shut down, until the time you decide to reopen it.
This will NOT work so neatly with things like Apache, or other service daemons:  even if you kill one, the system (or a watchdog process) may just start it right back up.  If you’re not sure, just try to kill it.  If it doesn’t stay dead, then it is probably a daemon, and you will have to find the actual documentation on that daemon in order to stop it properly (usually through its service/init scripts).  [This last, however, is something that you’d have to worry about if you were using your Linux pc as a server, too.]
Generically, one could just try the usual “kill” command.
In a Terminal/console/command-line—whichever term you prefer—type “kill” (without the quotes, of course), and then a space, and the “code number” of the process (sometimes called the “PID”, in Linux-lingo).
There is also the generic “killall” command:
killall program_name
Kill program(s) by name.
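Putting the last few paragraphs together, a typical “frozen FireFox” session at the Terminal might look something like this (the PID 4321 is made-up; use whatever number ps actually shows you):
ps -A | grep firefox     # find the hung program and its PID (firefox is just the example here)
kill 4321                # ask it politely to shut down
kill -9 4321             # the no-questions-asked version, if it ignores the polite request
killall firefox          # or just kill it by name, and skip the PID hunt
(And for a genuine service daemon, you would use its service scripts instead:  something like “sudo service apache2 stop” on an Ubuntu-type system.)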
In the Mozilla SeaMonkey web-browser, Ctrl + W seems effective at closing the problem tab, where one has opened multiple tabs (by means of “File > New > New Tab”:  once more than one tab is being used, it’ll give you a “tabber” icon, up in the upper-left of the top ribbon).  This probly also applies to most versions of FireFox, as well.

78. Entry 78:  In Linux, there are often WAY* S * to do things, rather than ** the ** way to do something (as in WINDOWS).  Actually, there are usually several options in MS Windows, where it comes to doing most operations.  But as far as most users are probably concerned, there is a “conventional”, “way that most of us do it”, and then other, less-used options.
In desktop Linux, by contrast, it can be a bit harder to pick-out a “conventional”, “standard” option.
Linux is a lot like practicing Law:  they teach you in pre-law (or at least they used-to) that “For every wrong (in jurisprudence), there is a remedy”.  Linux is a lot like this.  For every software problem (with some noteworthy exceptions), there is a remedy in the Linux metaverse (of free software, forums, community, &tc.)  Somewhere.  Try googling the issue by phrasing it in different ways.  The tabbed-browsing feature available in most modern web-browsers (even Internet Explorer) is handy in respect to this.
79.  Entry 79:  FILES-SYSTEM AND FILES-FORMATTING:
We’re getting a little deep into it here, but I thought I’d throw this in, though one may not need it to start messing with desktop Linux.
I’ll try and be clear.  I’ll try.
To try to be plain, there’s formatting, and then there’s FORMATTING.  (That should clear it up.)
There’s “Disk formatting”, to coin a term; and then there’s * files formats *.
In the beginning, like way back in 1976, there weren’t no harddrives (that were smaller than a breadbox, anyhow), but only these weird flat little plastic things that we today call “diskettes”—if you’ve ever even seen one.  (Which I doubt.)  Back then—when Jimmy Carter was in the White House—they were called by their proper name:  they were called   * Floppy Disks *.  But if you held one in your hand, it wasn’t “floppy”.  That condition would only occur if you were to break open the plastic case, and handle the recording medium inside.  And this stuff would “flop” in your hand.  Which is how they got the name, or so I’m told.
And that’s about all there was.  And each one only could hold a maximum of a paltry * 26 Kilobytes *—or some ridiculously low number like that.  (I just made that up—but I may in fact have erred on the side of being too * generous *.)
And so every operating-system that there was, was in fact “portable”, in that it could be stored on these disks (magnetically), and run on a small computer (or maybe a large one), like we can run some stand-alone application to-day from a cd.
That’s why we call everything a “disk”—even to-day.  Whether it looks like a disk or not.  If you run Linux from a USB thumb-drive, it is said to be running from a “USB disk”.
So any storage medium or volume is still called a “disk”.
This is also more than just inertia in language:  the geometrical term “disk” carries with it the implication that information somehow written to its surface(s) can be retrieved * AT RANDOM *—unlike magnetic tape, where you were pretty much forced to start at the beginning, and patiently wait until it spooled to the place where the desired file was located.
Okay, if you’ll forgive me for being wrong about the details (floppy capacity, floppies being the only media, 1976 as an exact year in this context), then you get the point, conceptually.
So these “floppies” had to store all kinds of data.  (Files were a lot smaller then.  They pretty much had to be.)  They had to store your data—like documents and spreadsheets; and they also stored and loaded the * operating system * itself—though not on the same floppy, of course.
(I think I read somewhere that around 1992, it could take like 24 floppies just to load Slackware Linux to one of the harddrives they had back then, and this with no GUI.  And it could take like ¾ of a day.)
So you had like one set of floppies for your operating-system, which I guess machines like the original Apple II loaded into RAM; and another set for your data, which you were working-on.
Enter data formatting.
At about the time IBM partnered with Microsoft (c. 1980), it seems their engineers settled on a standard way of dividing-up these floppy disks into * sectors *, so that a specific file or a part of the operating system could be pin-pointed and retrieved, without the whole surface of the disk having to be read—which of course would take longer.
So they devised a means—a “schema”—and this is known roughly as the “Master Boot Record” (MBR)—sometimes referred-to in * this * context as “MBRDos”.  And this worked by means of a Table—just like in math-class, where you see how you can take a ruler and a pencil, and mark-out a Table with rows that go across, and columns that go up-and-down.  If you put somethin’ anywhere in the grid, then you can “locate” it by means of the intersections as its co-ordinates.
Now imagine a table laid-out on a round table-top.  Just as the lines of latitude and longitude on a globe.  And so this became known as the File Allocation Table, or FAT disk format.  Now, * vFAT * (FAT 12, FAT 16, FAT 32) is actually the “normal level” * disk * format *, that runs ** on ** top ** of ** the low-level “MS/IBM/DOS/MBR/FAT-schema”, or whatever you want to call it—the latter “baked into” the platters of most harddrives.
Originally FAT could only manage 12-bit entries in its allocation table.  But as tech improved, it got up to 16, and then 32 bits.  And this “FAT 32” was then used for a long time.  It’s still the default disk (“volume”) format for certain stuff, like USB thumb drives.  (Strictly speaking, “vFAT” refers to the long-filename extension Microsoft later bolted onto FAT; but the name gets used loosely—including by me here—for the whole FAT family.)
The names you could give to your files, too, were limited to something like eight (8!) lousy characters (plus a three-character extension), which really sucked.
When Windows came along (after the original MS-DOS), it ran off early harddrives (which were also being perfected), and these were formatted (in the lowest level) to (you guessed it), the IBM-MBR/FAT/MSDOS schema.
MBR = “Master Boot Record”.  The MBR is always the very first * sector * on any disk that has the low-level Microsoft/IBM/DOS schema.  And this means just about ** any ** harddrive in the entire world.  The MBR (“Master Boot Record”) is always exactly 512 * bytes * in size.  * NOT * ** Mega-bytes **.  Just plain ** bytes **.  But * MBRDos *, by contrast (or whatever term you settle upon) is really a very low-level disk format (“device-format”, in this context), and has to do with the whole disk—not just the boot sector, or the various “volumes” (partitions) on the harddisk.
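You can actually see (and back up) this little 512-byte creature for yourself.  A hedged sketch follows; “/dev/sda” is the usual name of the first harddrive, but check yours carefully:
sudo dd if=/dev/sda of=mbr-backup.bin bs=512 count=1     # copy just the first 512 bytes (the MBR) into a file
sudo fdisk -l /dev/sda                                   # show the partition table that lives inside those bytes
Keep the if= and of= parts straight:  reversed, that dd command would happily overwrite your MBR instead of backing it up.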
Remember that a harddrive has spinning platters inside, that are shaped like a disk.  So did the aboriginal “floppies” we used to use, back in the 1980s—though they spun considerably more slowly.  This is why the term “disk” crops-up so much.
The IBM-MBR/FAT/DOS “schema” is (de-facto) a very low-level disk format that is neither software nor firmware:  it is * both *, and it is * neither *.  (Don’t they make things confusing?)  This “MBR-Dos” ought not be confused with vFAT (most commonly FAT 32), which is a * disk * format (volume format)—as is NTFS, or .ext2, or .ext4 in Linux.  FAT 32, NTFS, and .ext2 are  * disk * formats (volume formats) that run **on ** top ** of ** MBR-DOS low-level harddrive format.
Individual files—each with their own formats (such as Mp3, .ogg, .exe, .doc, &tc.)—in-turn run on-top of a disk-format (“volume format”) like NTFS, FAT 32, or .ext4.
(Linux does not need to assign extensions to files to denote their type, but often does so anyway, as a courtesy to Windows machines with which it may have to interface.  And for the benefit of * humans *, who expect to see files-type extensions at the end of filenames. )

So this is sort of where stuff like .doc, .exe, .bat, and so forth had their genesis:  as harddrives became practical for small computers, Microsoft and IBM (who were business partners in the early 80s) decided to use:  1) the “MS/IBM/DOS/MBR/FAT-schema” very low-level harddrive format—or MBRDos for short (* my * term);  2) the vFAT disk-formats (FAT 12, FAT 16, FAT 32, and finally, NTFS—which still has a lot of “family resemblance” to FAT 32, if you look deeply); and  3) the various files-types (.doc, .bat, .exe, &tc.), which were/are used to allow a MS-DOS system to understand what sort of file it was looking-at, and enable the user (human) to do something with it.
NOT EXACTLY.  But pretty close.
Of course there was “data formatting” prior to all this; but I am using language to illustrate a point.
Now enter Linux.
Linux as you may know is based-on UNIX, which is a multi-user system developed at the old AT&T Bell Labs (originally on minicomputers, though it came to run on everything up to mainframes).  The spiritual ancestor of UNIX was Multics, an earlier time-sharing system which Bell Labs had worked on jointly with MIT and General Electric, before pulling out of that project.  So all this “*nix” stuff has inherited its own files-system structure.  (And it is superior, frankly, to vFAT/DOS and even NTFS.)
But by the time actual * Linux * could enter, all the harddrive makers only knew how to build their products with the MBR/FAT/DOS low-level disk-format.  Or that’s all they were interested in.
This became even more entrenched, over time.  By the end of the 1990s (or so), the low-level sector layout was literally being written onto the platters of harddrives at the factory, and it is next to impossible for an ordinary person or small company to alter this.  Even with disk tools.
I guess you call this “market hegemony”.
Now, I am informed that, in the manufacture of more “modern” harddrives (say, after circa 2000 or so—depending on the manufacturer), the low-level disk-formatting (the “MS/IBM/DOS/MBR/FAT-schema”, or whatever you want to call it—* distinct * from vFAT (FAT 12, FAT 16, FAT 32))—well, that * low-level * sucker is ** mapped **, instead of “burned-in”, to the spinning platters of the harddrive.  What does this mean?  It means, roughly, that, while the “MS/IBM/DOS/MBR/FAT underlying schema” is still difficult to * replace *, it is now a kind of “virtualized” environment; and so it could be (perhaps) re-mapped * outside * of a factory environment.  For example, a modern harddisk sold as 160 Gb. might in reality have 240 or more Gb. available—but just 160 of that “mapped” at the factory (as “MS/IBM/DOS/MBR/FAT low-level schema”).  Why?  Because it is then possible to re-map around sectors that become corrupted/damaged.  This is possible, I’m told—but probly not with normal disk-tools, at the time of this writing.  No, instead, you’d have to download special tools from the harddrive’s manufacturer.  And learn how to properly use them.  Many are free, and come with instructions, or instructions are available online.
There are other low-level formats available for harddisks—like GPT [the GUID Partition Table—hence G(uid)PT]; but this is beyond my scope, at the time of this writing.  (For what it’s worth, GPT * replaces * the old MBR partitioning scheme, though it keeps a small “protective MBR” in the first sector, so that older tools don’t get confused.)  It is also quite a challenge to get Windows to boot from GPT, so I’d think this’d be for advanced people.  Only the newer 64-bit versions of Windows can boot from GPT (and only on UEFI firmware), and even then I’d think you better know what you’re doing.  Linux, on the other hand, seems to do pretty well on GPT.  And GPT, apparently, can handle some of the new drives that are too large for MBRDos (multi-terabyte drives).
So I guess we could say that it “breaks-down” like this:
Your harddrive (or thumb-key, or SSD)
> the “MS/IBM/DOS/MBR-FAT low-level schema”
> the format(s) on your disk(s)—the “volume formats”, a.k.a. “partition formats”:  stuff like FAT 32, NTFS, ext2, ext4, ReiserFS, btrfs, & so on
> your os (operating system—Windows, Ubuntu, PinguyOS, what-have-you), which “rides on top of” one of those volume formats—Linux will use the “native Linux” ones (ext2, Reiser, btrfs, &tc.), and Windows uses NTFS or vFAT (FAT 32, FAT 16)
> and inside your operating system (Windows or Linux) are your individual files:  .doc, .odt, .jpg, &tc.
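If you want to see that middle “volume format” layer on your own machine, most reasonably recent distros will show it to you with either of these (they only * look *, so no harm done):
lsblk -f     # lists your drives and partitions, with the volume-format (ext4, ntfs, vfat, ...) of each
df -T        # lists the mounted volumes, with a "Type" column showing the same thing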

Anyway, maybe it’s not so bad.  We have enough competing standards in the pc metaverse as it is.
So Linux (and other stuff, like Menuet) have learned to run off of this disk-format (volume-format—vFAT—as a frugal/P.M.I.-type install) and its underlying “MBRDos” structure, since day 1.  And to run from a native Linux volume-format where desktop Linux is put on the harddisk as a “traditional-type, full-install”.  Even when Microsoft improved the * disk * format/volume-format in the mid-nineties (actually starting with a clean sheet, sort-of), and shifted from vFAT (FAT 32) to “NTFS”, Linux was able to adapt (by degrees), and to-day can even * RUN * off of a drive formatted to NTFS!  (* If * we * do * some * hacking *:  Whether or not this latter is actually a good idea, though, is a subject for another webpage.)
And Linux can handle pretty much any files, stored in any files-format, whether natively UNIX or natively Microsoft.  [.docx might be a possible exception, but at the time of this writing (December of 2012), MS .docx seems well supported in OpenOffice 3.2 for Linux, and LibreOffice 3.]
But Linux can also store individual files in some native *nix formats, or some “international” formats, which have co-developed with the *nix metaverse.  I guess an example of this latter might be .odf/.odt, which is the default documents format for documents-creator programs that are often featured in Linux, such as Open Office Writer:  (though this format developed independently of Linux, I posit that there was collaterally a good deal of communication with Linux developers—and some of these were probly also the same people).  This .odt format was standardized first by OASIS and then by the ISO (the International Organization for Standardization).  Or at least that is my understanding.  You can still save documents to .doc in Open Office Writer/LibreOffice in Linux, however.
Today’s Linux can run off of a drive formatted to FAT 32, or NTFS (as a compressed/”frugal” type install), or a native * nix disk format, like .ext2, or .ext3 or .ext4:  or it can run off of an even more advanced disk format, like ReiserFS or btrfs (usually pronounced “bee-tree F-S”, or else “butter-F-S”).  These that are * after * .ext2—such as .ext3, .ext4, Reiser, and btrfs, are “journaling” file systems.  So these “post-ext2” formats can recover data better in the event of some disaster.  The .ext2 format, however, is still often thought to be more desirable for encrypted disks, because it is much easier to encrypt.
But all these just mentioned must run “on top” of the IBM-MBR/FAT/DOS harddrive schema (“MBRDos”), which is still the low-level, “master” allocation.
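And if you ever need to put one of those volume formats onto a partition yourself (a thumb-key, say), the graphical tool GParted is the gentle way.  The command-line way is a one-liner per format, but a dangerous one, since formatting ERASES the partition; so treat this as a hedged sketch, and triple-check the device name (“/dev/sdb1” here is only an example):
sudo mkfs.ext4 /dev/sdb1         # format the partition to ext4 (a native Linux, journaling format)
sudo mkfs.vfat -F 32 /dev/sdb1   # or format it to FAT 32, the lowest-common-denominator format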
So if you have a video file of, say, 14 minutes in length, and it is stored on your Linux system, then it may well look [ something ] like this:
Your Harddrive > MS/IBM-MBR/FAT low-level schema > .ext4 > Linux operating system > My videos directory > .flv file > your media player > images you see on your screen when you play the file.

So much for Files-Formatting.

A File-System, on the other hand, for our purposes, is just the same as an operating-system:  like MS WINDOWS, or Slackware Linux, or Ubuntu.  Another term sometimes used is “software stack”.  There is also a “network stack”, which is just a group of inter-co-operative programs * inside * an operating system, which facilitates connection to other computers.  So this “network-stack” thing is like a smaller operating system * inside * your WINDOWS XP or Ubuntu, which allows you to connect to the internet without as much hassle.  (When it is working right, of course.)
NO, THE INFORMATION I HAVE JUST RENDERED IN THIS ENTRY IS NOT EXACT, IN TERMS OF BEING TECHNICALLY CORRECT.  I have perhaps only confused both you and myself with this entry, and perhaps not laid-it-out correctly.  BUT YOU GET THE ** IDEA **, WHICH IS MAINLY WHAT I’M AFTER.
80. Entry 80:  I HAVE SOME FILES STORED ON A USB THUMBSTICK. HOW DO I OPEN (MOUNT) THIS THUMB-KEY, AND HOW DO I SAFELY REMOVE IT?
While your Linux desktop is up and running (or even usually from a cold-boot), an ICON will normally appear on the desktop screen, and this is usually followed by a menu popping-up in front of you, and often this is already displaying the contents available.  If the contents window (really an invoking of your files-manager) does not appear, then RIGHT-click the icon, and there’ll be an option to open the thumb.  If the icon isn’t visible to you, click your desktop switcher, and that may show it.  Remember that Linux also has between 2 and 6 (or more) virtual desktops/workspaces running—though we normally just use the first one (default)—at least until we get really used to things.  Clicking in blank space on your desktop, and then scrolling the mouse wheel may flip you through the desktops available.
If you cannot find an icon for the USB thumb-drive, then go to your menus, and click your files-manager.  In Ubuntu and its variants, this is usually Nautilus or Thunar or maybe PCManFM.  In KNOPPIX you have Konqueror, which can also be pointed at the web with a click, and used to browse the internet.  In Puppy Linux, most builds (say, 4.3.1 & later) use ROX-Filer, a good program which appears as a series of icons on the desktop—though you can access the whole files-tree through the “boss” one.  Puppy also comes with P-Mount, which is a drive-mounter/un-mounter.  Ubuntu and KNOPPIX each have such a mounter/un-mounter.
Anyway, opening your files-manager in desktop Linux is effectively opening a key-way to all the drives and files on the system.  (In recent builds, anyway).
If this doesn’t work, you can resort to command-line.  There are instructions on the web; but I tend to think that by now, we shouldn’t have to.  We should be able to accomplish something as rudimentary as this graphically.
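For completeness, though, here is roughly what that command-line fallback looks like.  It is a hedged sketch:  thumb-keys often show up as /dev/sdb1, but yours may be named differently, so check the output of the first command before you mount anything:
sudo fdisk -l                       # list all the drives and partitions the system can see
sudo mkdir -p /mnt/thumb            # make a mount-point (just an empty folder)
sudo mount /dev/sdb1 /mnt/thumb     # attach the thumb-key there; your files now appear under /mnt/thumb
sudo umount /mnt/thumb              # detach it again, BEFORE you pull it out of the socket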
To “Safely Remove” the thumb-drive, close the documents or other files from the drive that you might have open; close the contents-window that might have been minimized to the Panel (the task-bar along the bottom—Tint2, in some distros), and RIGHT-click the drive’s icon.  In the context menu that pops-up, you can click “safely remove”, or “unmount”—which are really the same thing, but some distros just put it in there both ways.

It is probably worth noting that, while I have found that the little light on my USB thumb-keys and my external, USB-powered 500 Gb. harddrive would be turned-out when I would unmount/click on “safely remove” in BOTH Windows 7 Home Prem AND Ubuntu 10.04 with GNOME 2.x on my big lappy, “safely remove” DOES NOT turn it out on my Windows 7 Starter on my AAO netbook, OR my current Linux desktop on the same netbook (this would be Linux Mint 13 XFCE Ed.).  This fact apparently does no harm, and does not lead to data-loss (according to my own experience and research, at least).  But I suppose it could be a bit disconcerting, to a user that is used to GNOME 2.x turning the light out, when “safely remove” is clicked.  For some advice as to this matter–if you are paranoid about data loss/corruption–there is an entry in my document “The Most Important Things a Linux Desktop Newbie Needs to Know”, the entry on “SAFE REMOVAL OF EXTERNAL USB HARDDRIVES and THUMB-KEYS”.  It is posted on this blog.

81. Entry 81:  MISCELLANEOUS ISSUES WITH DESKTOP LINUX:

INSTALLING DESKTOP LINUX TO VARIOUS SPECIFIC LAPTOPS IS PROBLEMATIC, AND CAN REQUIRE YOU TO COMPILE A “CUSTOM” KERNEL,
and this is usually because of issues with Linux and acpi (the power-management firmware interface).  Towers and other non-mobile machines have acpi too, but they lean on it far less (no battery, no lid to close), so this is rarely an issue for them.  But I think I covered this already, pretty well (above).

FUNCTION KEYS (Fn) ON LAPTOPS/NETBOOKS
It can be hard to get these to work.  Sometimes impossible, but on other hardware, it may work right away, without you having to adjust anything.  I don’t use these keys myself, so it’s no big deal for me.  One might also check-out Touchpad Indicator.
https://help.ubuntu.com/community/SynapticsTouchpad
This can enable you to easily control your touchpad.
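If your distro uses the Synaptics touchpad driver (most of this era do), there is also a little command-line knob called synclient.  A hedged sketch:
synclient -l                # list every touchpad setting the driver currently knows about
synclient TouchpadOff=1     # switch the touchpad off (TouchpadOff=0 switches it back on)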

REGRESSION-BUGS:
This is a name for the annoying occurrence of features (like audio) that were working fine being suddenly rendered broken after an upgrade to the next release (or a newer release) of the distro.  You’ve carefully upgraded from, say, Ubuntu 9.10 to 10.04, and now your wifi has stopped working.  Unfortunately, these sorts of issues are hard to fix at the creator’s end, primarily because of limited funding.  Too bad.  Windows and Microsoft enjoy a free lunch, because every hardware vendor in the world tests their products for ms Windows, because ms Windows is dominant.  The poor ole Linux community, by contrast, has to pay for all of this testing itself, or be content to ship the next version of Ubuntu (or Fedora, KNOPPIX, &tc.) not knowing if it’ll really perform on certain hardwares.

DRIVER AVAILABILITY LAG-TIME:
This is sort of in the same vein as the above entry.  Windows used to have a big problem with this too.  When the first versions of Windows XP hit the stores, there was a huge lack of drivers.  It took many, many months for certain hardware components-makers to catch-up, and release working drivers (for certain sound cards, printers, even graphics displays).  As Windows became even more dominant in the marketplace in the early 2000s, manufacturers of these components took notice, and were more comfortable in justifying the business expense of putting extra people on writing these drivers, and keeping them up-to-date when ms released new service-packs.
Where it comes to modern Linux distros, not as much headway has been made; it can still take many months for an open-source driver to be made available for the latest model of Nvidia or ATI graphics cards, or Broadcom wireless cards.  The manufacturers of a few other such components do not see fit to have in-house support for Linux at all, nor even to give Linux coders some feedback where it comes to engineering a driver for the latest chips.  Often, though, this doesn’t stop the Linux Community at-large, and a “native Linux-FOSS” driver gets reverse-engineered, and published on the web (or it might already be included in your Linux distro’s default repos).  [UPDATE:  as of roughly the spring of 2012, the major Linux distros and Broadcom seem to have come to an accommodation, and Ubuntu 12.04 LTS and the others now seem to come * with * the Broadcom drivers installed.]  Why should these vendors cooperate?  They’re corporations; and Linux–while it makes money–is a lot less well-heeled, because Linux isn’t profit-driven.  So I guess some of the bosses at these companies perhaps just don’t see the point in co-operating with a “weird” operating system, which has only about 1% of the end-user market.  Or they think if they do, they’ll be giving-away the store.
Toward the * OTHER * end of the spectrum is Atheros, a brand of wireless-card which has a good reputation for supporting Linux.  (Atheros makes other cards too.)  Let’s remember that * nothing * is perfect, but Atheros, generally, has a reputation for * wireless cards * that work with Linux distros.  Also Intel cards usually work “Out Of The Box”, as the saying goes.
Recently, I came across a web-page that gives us indication of * which * WI-FI cards are supported in Linux.
https://help.ubuntu.com/community/WifiDocs/WirelessCardsSupported
This is worth a look—but just as with * all * things Linux—you need to check it some with your own research:  don’t take anything at face-value, even from an * official * Ubuntu page.  Make sure the info is up-to-date, and is confirmed by at least one other source.  This practice applies to Windows, too.  But it is especially important in desktop Linux, as there is no System Restore.  There * is * Back-In-Time for Linux:  I’m informed that this app can give an Ubuntu user the functionality he/she may be used-to with Windows’ “Restore Points”.  But I haven’t had a chance to try it out, at the time of this writing, so I can’t comment as to how effective it is or isn’t.  Therefore, without knowing a reliable equiv for Windows 7’s System Restore, we should probably assume that:  “Any changes to a fully installed desktop Linux distro—beyond superficial things, like personal settings, & rel.—may be tough for a novice to undo, without a complete re-install (of Linux).”  In any case, installing/re-installing desktop Linux is not that hard.  At least not once you’ve discovered a distro that really * likes * your hardware & components—i.e. which “runs good” on your equipment.
And let’s not forget USB-dongles:  these are still on the market, at the time of this writing (spring 2012), they have come-down in price (somewhat, in adjusted-dollars), and they often work for your Linux install.  There are two brands I see often—Belkin and Netgear.  The one I’m using right now is a Netgear.  Unfortunately, there seems little brand-specific reliability, where Linux support is concerned:  as regards these two brands (at least), it seems to just come down to the specific model.  Research it.  Take your computer to the store, and see if the salesperson will let you try it.  Many retail outlets today have wifi available—even if only for their own purposes.
There are portable routers too, like the kind that can plug-in to your car’s DC outlets.  And Linksys makes the well-known WRT-54G, a home router with a good record of Linux support (in fact, it runs Linux-based firmware itself).
Modern desktop Linux will also let you change Network Manager for another program—most notably “wicd”.  You would probably need to disable Network Manager after you downloaded wicd, though.  Note that wicd comes with Vector Linux by default, and is available in Ubuntu and Linux Mint repositories.  As with any such program change, research it before you do anything.
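The swap itself is not complicated, though (as I said) you should research it for your particular release first.  On an Ubuntu/Mint-type system the gist is something like this (a hedged sketch; how you keep Network Manager from starting again at the next boot varies by release):
sudo apt-get install wicd           # pull wicd in from the repos
sudo service network-manager stop   # stop Network Manager, so the two don't fight over the wireless card
Then start the wicd client from your menus, and let it take over the connections.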
The work-around for really difficult hardwares is simple:  research the hardware before you buy a machine that you intend to use to run Linux, and actually you might just be better-off with a machine that’s pre-owned, and for which the Linux community has therefore had time to compile drivers.  Remember that Linux is almost always * faster * on any machine, so you probably won’t lose much of a speed-advantage relative to the latest version of ms Windows, running on the latest hardware—if you lose any speed at all.  Pre-owned computers are cheaper on Amazon anyway, than new equipment.  Just make sure Windows is okay when you unpack the machine.  If not, send it back.  [Remember that, while I haven’t booted my Windows in * months * now, I still don’t advise that you dump your ms Windows install.  Shrink it, yeah—to make room for Linux, and, while we’re at it, perhaps a nice big FAT32 partition to store/backup videos, music, &tc, as this is easily accessible to both Windows and Linux.  There’ll be times when I’ll still need Windows, and so will you.]
An added tip is to try one of the “bigger boys” among Linux distros—Linux Mint, PinguyOS, AriOS, KNOPPIX DVD-version, others:  these 1 Gb + distros that you have to burn to a DVD often come with extra drivers, already rolled-in and auto-configurable for you.  Some distros (like Mandriva One) also come with a “cutting-tool”, which allows the Linux to locate the Windows driver, copy it, and then help you set that up to work on the Linux partition without hurting Windows.
BATTERY-MANAGEMENT:
Even if you can get desktop Linux working on your laptop or moby without completely disabling ACPI/APM (and often it * is * possible), you may well find that Linux is still not the equal of most WINDOWS installations,  where it comes to squeezing maximum use-hours out of your power-pack.  I NEED TO INTERJECT HERE, HOWEVER, THAT A VERY GREAT DEAL OF TRACTION HAS BEEN MADE ON THIS ISSUE IN THE LATE 2000s, in Ubuntu and more-to-the-point distros based-on Ubuntu.  As well as several RPM/Red Hat-type Linux.  My solution for this (if it’s a problem for you at all), as I may have already said, is just to upgrade the power-pack to one with more cells.  It costs little, if you are one who is likely to keep the laptop/device for the duration of its usual lifespan.  And it usually makes your user-experience better—even in WINDOWS.

NVIDIA and ATI graphics cards:
I’m sorry, but I just haven’t had time to re-edit and finish writing this entry, as to difficulties sometimes experienced with the very newest Nvidia and ATI GPUs.  I will try to get this done by-and-by, but I have a lot of other stuff on my plate right now.
Broadcom wireless/wifi cards:
These were a much bigger bug-a-boo prior to releases of desktop Linux circa 2011/2012 (which roughly coincides with the timeframe of this writing).  Broadcom drivers seem to be “not exactly closed, and not exactly open-source”.  Google a query something like “the history of Linux and Broadcom support”, if you’re curious.  But really, the major distros seem to have just found ways to get right-around this “issue”, in recent months/years.  Particularly those distros that are too big to fit on a single cd—you know—Ubuntu re-hacks done by distributed cyber-communities (like Linux Mint), or by individuals (PinguyOS, PCLinuxOS (this one is Mandriva/RPM-based), perhaps CrunchBang, others).  And the distro-makers have had significant help, apparently, from the new (2011) Linux 3.x kernel, and its larger set of installed modules.  All these taken together, Broadcom wireless support just isn’t the issue it used to be.  My downloaded and burned DVD of Linux Mint 12 booted and ran on my netbook with its Broadcom n-series wireless card, and performed just like it was * made * for that machine.  No muss, no fuss, no worries.  And I did not have to download and/or install any drivers.  Mandriva Linux is another example:  I have not tried it, but I am told that the Mandriva ppl have built this nifty little tool into their distro, that locates your WINDOWS Broadcom driver(s), makes a copy (or copies), and then uses these to interface with the card.  There is also such a tool available for Ubuntu, by the way, but it is apparently not included by default; you would need to add it to your system.  Even then, this U-tool only appears to take you half-way:  it will copy the drivers to a location you can see, but the rest I gather must be accomplished manually.  See the section on “DRIVER AVAILABILITY LAG-TIME”, above.
ACCORDING TO SOME PEOPLE, VOICE RECOGNITION (IN BOTH DIRECTIONS) IN LINUX MAY BE BROKEN, AND THIS MAY HAVE BEEN SO FOR SOME TIME.
LINUX (and MAC) STILL REQUIRE YOU TO DO A FEW, VERY SELDOM NEEDED THINGS FROM THE TERMINAL/COMMAND-LINE.  BUT REALLY, THIS ISN’T MUCH OF AN ISSUE ANY MORE, FOR * MOST * USERS.
I've been using Ubuntu 10.04 as my daily OS for productivity for months now, and I haven't needed to resort to the command-line in all that time.  [UPDATE:  I'm now on Linux Mint 13 XFCE Ed., fully a-year-and-a-half after first "finalizing" this article—and I * still * have not actually * needed * the command-line.  I only use it for "experiments"—because I'm a bit of a tinkerer.]  I have used the command-line/Terminal in that time essentially because I wanted to learn it—not because I really needed to.

DESKTOP LINUX IS STILL MISSING EQUIVALENTS FOR CERTAIN WINDOWS GRAPHICAL SOFTWARE.  There are audio editors, but no really superb one of the kind you find in WINDOWS.  It is really hard to develop this type of application anyway, because the developer team has to have not only really good programming skills, but they must also be truly excellent musicians, perhaps with perfect pitch.
Further, there appears to be no really acceptable AUTOCAD/3D equiv. for desktop Linux.  An ordinary person can be more than satisfied with modern desktop Linux.  So are many professionals in intermediately technical fields (accounting, family dentistry, photography, journalism).  But if you're a graduate architect who needs to do some sophisticated 3-D modeling, or perhaps a structural engineer with similar requirements, you may very well find yourself forced back to the WINDOWS platform, in order to run a program that costs the equivalent of the gross national product of some small third-world country.  UPDATE:  I stumbled on a website which seems to show some serviceable native Linux CAD/CAM apps:

–>CAD/CAM:

link: http://www.techdrivein.com/2011/08/8-best-cad-apps-for-linux.html
Linux also apparently lacks the same level of HTML editing and web-page design capability as is possible in WINDOWS with pay-for applications.

In the history of its existence, desktop Linux has lacked certain lesser-used but critical "business-ware"/office-software. BUT THANKS TO A RECENT MERGER AND SOME OTHER DEVELOPMENTS, THIS SITUATION APPEARS TO BE CHANGING RAPIDLY. There now seems to be a serviceable equivalent for Microsoft Exchange, and for quite a few of the other programs that business people need. SO YOU MAY NOT NEED TO SAVE YOUR WINDOWS INSTALL AFTER ALL!

Some links to support this notion:

Active Directory gives a business a robust, scalable, and secure place to centrally store user identities, passwords, and other server and workstation settings.

This is where an "AD bridge" product like Quest Authentication Services can help get a Linux machine playing along; see their paper "The 12 Critical Questions you need to ask when choosing an AD Bridge solution".

Source:

http://communities.quest.com/community/iam/blog/2013/04/03/so-you-own-linux

Microsoft Exchange:

http://www.linux.com/learn/tutorials/338482-microsoft-exchange-alternatives-for-linux

http://www.smallbusinesscomputing.com/biztools/article.php/3932591/Top-5-Open-Source-Alternatives-to-Microsoft-Exchange.htm

Microsoft Project:

http://www.maketecheasier.com/5-best-free-alternatives-to-microsoft-project/2012/01/14

Adobe Acrobat Pro: *nix now has PDF Studio

ERP (so-called Enterprise Resource Planning software):

see "10 of the Best Free Linux ERP Software" (Linux Links, Mar 31, 2012):

http://www.linuxlinks.com/article/20091129070817552/ERP.html

Enterprise Resource Planning (ERP) software manages the information and functions of a business.

Let's try to recall something that seems to have been forgotten: Not so many years ago, even mid-sized and small companies would just * hire * a programmer to create a needed utility for their systems—even if they were using * Windows * NT *. There was nothing stopping business people from doing this back then, and there isn't now. A business using a UNIX or Linux system could just * pay * somebody to create a needed program.

MANY DISTROS HAVE A HABIT OF "SHUFFLING" THE PROGRAMS IN THEIR SUITES OF INSTALLED, DEFAULT APPLICATIONS, ACROSS RELEASES.  I'm talkin' to YOU, Ubuntu.
I was good-and-used-to the video editor I had by default in Ubuntu 9.04.  Now after I upgraded to 9.10, I find I have been switched to a different one.  A few other key apps may have been swapped-out.
But this is not a big enough deal (IMHO) to seriously throw a new user.  The biggest deals—as I see them—are the * learning-curve, coming from WINDOWS *, and * Hardware-compatibility issues *.  Other stuff—if it befalls you at all—is much more easily solved in 2012 than ever before, and is competitive with ms WINDOWS.

RPM-BASED DISTROS (LIKE FEDORA AND MANDRAKE) USED TO BE BESET BY "DEPENDENCY PROBLEMS":  but this issue has effectively been solved for several years now.

THE DEVELOPERS OF THE VARIOUS DESKTOP-ENVIRONMENTS (GNOME, KDE, OTHERS) ARE FOND OF MAKING SIGNIFICANT CHANGES TO THE APPEARANCE AND OPERATION OF THESE INTERFACES.  This has been exacerbated in 2011 and 2012 by the "netbook-and-small-device revolution", which is still under-way at the time of this writing.  But they were at this (somewhat) before the netbook-craze.  It's nothing you can't overcome, but really, it is excessive and annoying (depending of course on who you're talking-to).

FONTS AND FONT-RENDERING IN LINUX USED TO BE SORT-OF "CRAZY-AND-MIXED-UP".  With some fonts not available.  I have never had much of this problem, and I've noticed updates to Open Office 3.2 seem to have fixed what issue there was, in just the last year-and-a-half, or so.  And most distros nowadays come with Libre Office, which is better and more advanced (or LO can be added to the system).  So this is just not much of an issue anymore—or so it seems.

LINUX IS KNOWN FOR BEING BACKWARD-COMPATIBLE WITH HARDWARE.  BUT IT IS OFTEN NOT BACKWARD-COMPATIBLE WITH SOFTWARE (EVEN ITS OWN NATIVE STUFF).
Ubuntu, for example, will often boot and run on some very old stuff, like a ThinkPad from 1999 with Pentium 2 and 128 Mb RAM.  But if you use it long enough for a couple of LTS release-cycles to have elapsed (say, six years), then some of the non-default apps you like might quit working, and you may have to go with a different app to get the same functionality.  And it might be very difficult to get exactly the same functionality.  The situation (I guess unless you're a * SLACKWARE * user) seems, well, "paradoxical":  Linux is Open-Source, but it can be rather difficult (read:  * mighty * difficult, for a noob) to install a newer version of, say, your favorite media-player or some other app, in your current release of, say, Ubuntu—without upgrading the whole she-bang to the next (higher) release of the distro.  Unless of course it's in your repos—which would probly mean it would just be the same version you already have, anyway.  This stodginess is there for a reason:  it's to prevent us not-so-techy people from messing-up the dependency libraries on our Linux-based system.  There are ways around it, of course—like compiling from source.  But you know, even as convenient as compiling most *nix stuff from source is nowadays, it's still more hassle than the average WINDOWS user probly goes-through, just to get an updated version of an app.
I also have the sad duty to report that at least a few people have complained that their favorite app from 2 or 3 releases ago now no longer works in the newest versions of their favorite distro.  And they’re told not to install the old version (of the app) to the new release of their Linux.  Even so, with over 30,000 free programs from which to choose, I’d think a bloke would stand at least an even-chance of finding a similar one that wouldn’t upset his distro’s internals.
But as we know, Linux continues to develop.  We’ll see what the near future brings.

AN UPGRADE [TO THE NEXT INCARNATION (RELEASE) OF YOUR LINUX DISTRO] IS ESSENTIALLY THE SAME AS A RE-FORMAT.
Doing a system upgrade in Linux can render it no longer functional.  [This can also occur in WINDOWS, by the way—and, as in Linux, for the same reason—i.e. we did not fully-enough understand what we were doing; OR, a WINDOWS re-install or system upgrade can sometimes go-awry for reasons that seem to defy explanation.  This latter is arguably less often the case with Linux.  At least where it comes to rudimentary elements of the process.]  But some people (not properly informed) point to this “upgrade-risk” as a deficiency of desktop Linux.  See the entry on “Installing Linux”.

APPENDIX A:  Frugal install of a re-mastered Ubuntu as a read-only operating system, to your harddrive, and other methods of live-booting, for privacy and security.
PLEASE UNDERSTAND, THAT IF YOU DECIDE TO UNDERTAKE ANY OF THE PROCEDURES BELOW—JUST AS WITH * ANY * OF THE LINUX PROCEDURES I HAVE DESCRIBED IN THIS DOCUMENT OR ELSEWHERE—YOU ELECT TO UNDERTAKE IT * AT * YOUR * OWN * RISK *, AND ** YOU ** ARE LIABLE FOR ANY MESS YOU MIGHT GET YOURSELF INTO WITH YOUR HARDDRIVE.  Especially with regard to Method 3, which writes some code to Master Boot Record, unlike Methods 1 and 2.  * I * am a person who * likes * the "frugal-install" method, because I don't like to lug-around thumb-keys, or external drives.  And I like the security/privacy advantages of the frugal ("P.M.I.") type of install.  The first two Methods depicted are really pretty safe, and do not radically change your harddrive.  But some modicum of competence with computer hardware and software is still required of you.

A VERY IMPORTANT THING TO UNDERSTAND ABOUT THIS, is that we want to keep the programs and settings-changes written to the persistent-save area * to * a * minimum *.  WHICH IS WHY I USE REMASTERSYS.  There are also other programs available to re-master customized Ubuntu and its variants.  THE MORE DATA FOR WHICH THE PERSISTENT-SAVE AREA BECOMES RESPONSIBLE, THE S-L-O-W-E-R Ubuntu AND ITS VARIANTS TEND TO GET, WHEN UP AND RUNNING.  So if we can re-master our "tweaked" Linux distro, and then use the custom .iso for the purpose of creating the live-compressed/"frugal-install" to thumb-key or harddrive partition that we later intend to make persistent, then getting things to run at normal speed becomes less of a problem.
You ought to understand also, that there is USB 1, USB 2, &tc.  There can be compatibility problems with this, where live booting of Linux is concerned:  A bootable Linux USB-2 thumb can take a * long * time * to boot, * where * you * are * trying * to * boot * the * sucker * on an old computer, whose socket is only USB-1—though this often isn't labeled on the computer's case—or the label has been lost/scratched-off.  You may be able to discern what version of USB is installed by running Linux's "HardInfo" utility, or trying to find a hardware manual or other documentation for the machine online.
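If you'd rather check from a Terminal than go hunting for a manual, the lsusb utility (from the usbutils package, which most desktop distros of this era seem to ship by default; if yours doesn't, it is a small install) can usually tell you:

lsusb -t

In the little tree it prints, the speed shown for a port is the give-away:  12M means USB 1.1, and 480M means USB 2.0.  The exact output format varies a bit from version to version.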

Method 1:  Use USB thumb-keys to create a frugal/P.M.I. install of your favorite desktop Linux distro to a FAT 32 partition on your harddrive, without touching your MBR (Master Boot Record):
PLEASE NOTE that this procedure is at least * somewhat * of a risk, as it alters boot-loader configuration.
CONDENSATION for Method 1:
1. The first thing I advise is to create a desktop Linux full-install thumb-key, as described in the Expansion, below.  This will allow us to add the programs we may want, but which our Linux distro may not have come with as part of the download—and we will be able to do so without having to copy these new softwares to the persistent-save partition later-on, because doing it that way—though easier—tends to S-L-O-W Linux down considerably:  we will add the desired programs to the temporary full-install USB thumb-key, and then the settings and customizations we want.  We will then install the GUI program RemasterSys (or use another utility for the purpose of re-mastering), and run this, creating a new, custom .iso of our Linux distro:  and we will then use * that * from Unetbootin or Universal USB Creator from Pendrivelinux.com (or another program for this purpose), in order to "frugally install" our custom .iso to (another temporary) USB thumb-key, and ** this ** will then allow us to do our final step—which will be to "frugally install" Ubuntu, Fedora, or a variant, to a FAT 32 partition in our harddrive.  If we want a persistent-save partition, this can be created and activated later-on, after we have gotten our frugal-install working.
2. Why so many steps?  Doing it the way I have described allows you to do all this from easy GUI programs (and free-of-charge, too!), which will do all the work * for * you, “automagically”—and most importantly—* without * touching * your * Master Boot Record * (MBR) * in * your harddrive *.
And ** this ** is the user to whom this Method is directed:  people who wish to have a frugal-type install of Ubuntu to their FAT 32 partition in their harddrive, so they don't have to lug that stupid USB thumb-key around with them, and so that everything will be easy to un-do, should they decide to un-do it in the future, without having to run Windows' Repair Disk and go to a command-line to restore the computer's MBR.  One ** can ** just accomplish this the "normal" way, using Unetbootin to do the frugal-install to one's FAT 32 partition on the harddrive, the FAT 32 having been created ahead of time with Windows' Disk Manager.  (Or I guess you could use G-Parted if you like—but since Win 7 came out, I like Windows Disk Manager for the purposes described in this entry.)  * But * you * should * know * that doing this the "normal way" means that the Master Boot Record of your harddrive will be over-written with Syslinux bootloader's mbr-code—so not only will your machine boot the frugal-install by default—but you will probly have to learn how to chain-load with something like Grub4dos if you want to be able to boot Windows Vista or 7 (not that difficult, really), as Syslinux seems to have difficulty booting Vista/7 directly.  Syslinux has a much better relationship with Windows 98, 2000, and XP.  Microsoft changed the bootstrapping-process somewhat in Vista and up.  Look at Method 3, below, for the "normal" way.  But frankly, I prefer not to mess with Windows Vista/7's MBR period, because it can sometimes be difficult for a (relatively) new user to un-do, if the person decides that they want the MBR back in its original condition.  So I like my "weird way".
Another thing you should know is that NOT ALL VERSIONS OF Unetbootin ARE THE SAME—OR EVEN MUCH SIMILAR.  Yes, the interfaces are pretty similar, where it comes to running Unetbootin from, say, a newer version of Linux vs. an older release, and/or running Unetbootin from ms Windows.  But this is where the similarities end:  Unetbootin will have different features and capabilities, depending on what operating-system you’re trying to use it from.
If, for example, you are trying to use Unetbootin to do a frugal-install of Ubuntu to your machine’s harddrive, I recommend running it from Ubuntu 10.04.  And of course, this is assuming that you have already partitioned your harddrive the way you want it, having already created a FAT 32 partition.  This type of partition is excellent to have in the harddrive of * any * computer that there is, as it is wonderful for data-backup, storage of personal-files, the creation of a separate “home”-partition that can interface with Microsoft Windows (2000, XP, 7, & yada yada), ** and ** it can also be employed to hold and boot a “frugal-type” install of desktop Linux.  To-day, you can make this as a NTFS partition, if you want!  I’ve done it, and it performs exactly the same, and for * all * the purposes I’ve just listed—including frugal-booting.
EXPANSION for Method 1:
1. I would recommend using Windows 7's Disk Manager to re-size ms Windows, being careful not to shrink Windows to too small a size:  don't make it too small, or it won't boot—and leave enough room for updates—and then add a little room to that.  Windows is about twice as space-hungry as Ubuntu/desktop Linux.  Then create a FAT 32 or NTFS partition out of the remaining "unallocated" space.  There are plenty of instructions on how to do all this, available for free, online.  I RECOMMEND RE-BOOTING INTO WINDOWS, AS SOON AS THESE TWO THINGS ARE DONE.  Windows is programmed to "keep-track" of its partition-size.  Change the size of Windows' own partition more than once between re-boots, and you'll probly have to do a full system-recovery/re-format.  Windows should probly recognize the new partition as "E:\"—unless you're using a netbook, which has no cd-drive, in which case it will probly be seen as "D:\".  BUT NOT NECESSARILY.  It depends on whether your machine has more than one internal harddrive, and other hardware variables.
2. Now, you can download different Linux desktop distros and releases of distros, and make these into bootable DVDs or USB thumb-keys, as I have described elsewhere—or per instructions freely available online.  Find the Linux you like, and which “likes” your hardware (see my recommendations in the main-body).
3. When you have determined your Linux distro of choice, install it to a USB thumb-key as a full-install.  You do this with a live cd/DVD or live thumb-key, just like installing to a harddrive:  but you "point" the installer-program (Ubiquity) at a "target" USB thumb-key.  Use care in understanding what location you are pointing-to.  If you realize you've made a mistake, it is still often better to just let the installer go ahead and do its thing, and complete, than to use the "abort" or "cancel" feature:  doing the latter often results in corrupted sectors, and a damaged harddrive.
4. This full-install to another USB thumb-key will allow us to “tweak” Ubuntu—to customize it with the extra software we might want, and save settings and personalizations.
5. Once we have customized our desktop Linux distro, let’s install RemasterSys, or our favorite re-mastering software.  Running this will give us a custom system-image, in the form of a new .iso, in a folder created in our Linux system that’s being re-mastered.  The only tedium is waiting for it to complete (it will take a * while *, and you aren’t supposed to be using the computer for anything else, while RemasterSys is running).
6. After we have our new .iso (see Method 1, Expansion, step 1, ca. paragraph 16), let's then create a folder inside our new FAT 32 harddisk partition, and copy the .iso into there.  If you are having trouble finding the new .iso, see the section to which I refer you.  Right-click the new .iso, select "copy" with the left mouse-button.  Open a new instance of Nautilus files-manager (or Thunar, ~ whatever one you're using), and browse into the FAT 32 partition on your harddrive.  Create a folder in there—go to file > create new folder.  Name it "My_iso_backups".  Or "Mary_jane".  NOTE I did not type a period or "dot" in the new folder's name.  I connected words with an underscore, and I began its name with a capital letter—but kept all the rest of the letters as lower-case.  Now, use ctrl + v, and the custom .iso should be copied into the new folder.  (A Terminal equivalent of this step is sketched just below.)
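For the Terminal-inclined, a rough equivalent of this step would be something like the following.  (The mount-point /media/STORAGE, the folder name, and the file-name custom.iso are only examples I made up; substitute wherever your FAT 32 partition is actually mounted, and whatever your new .iso is actually called and wherever it actually lives.)

mkdir /media/STORAGE/My_iso_backups
cp ~/Remastersys/custom.iso /media/STORAGE/My_iso_backups/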
7. The next order of business would be to use our new custom .iso to make a bootable thumb-key.  This is going to act as a “stand-in” for our FAT 32 partition, and we can re-format the thumb-key later, and again use it for other purposes.  The USB thumb-key we’re gonna use as our “target”—our install of our new .iso image—should probly be at least 8 Gb in size, although it is often possible to use a 4 Gb. thumb-key for this.
8. If you want to add persistence to this scenario, you can do that as an afterthought.  Create a partition of about 3 or 4 Gb. on your harddrive (or on a USB thumb-key).  Format it to some Linux volume-format (.ext2, .ext3, .ext4, Reiser, btrfs, et al).  I like .ext2, because you then have the option to encrypt the partition later, if you feel like it.  Name the new partition “persistr”.  Or “maryjane”.  Close G-Parted.  Now open Terminal:  use the e2label command to set the correct partition name for a Linux persistence partition.  The correct name is “casper-rw”.  Then, the next time you boot your live/frugal Linux, type “persistent” at the boot-prompt.  This will load Ubuntu in “persistence mode”.  See below—Method 2, Condensation, step 6.
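In case it helps to see that last bit spelled-out, the Terminal part of step 8 boils down to a single command, something like this (the device-name /dev/sdb5 is only an example; check the real "address" of your "persistr" partition in G-Parted first):

sudo e2label /dev/sdb5 casper-rw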
9. We need to understand that we are intending to install Ubuntu or a variant to the thumb-key—but as a * compressed * install.  What that means, is that the thumb has to have room for a file-system big enough to hold the Linux distro’s size as an .iso file.  PLUS we should give it a little extra room—say, on the order of perhaps 150 Mb. or so—because we’re also gonna get the Syslinux bootloader onto the thumb key, which is good for being able to start .iso-type installs.
If the iso image of Ubuntu 12.04 happens to be, say, 760Mb., then let’s settle on at least about 1 Gb. for our thumb-stick.  Because 760 Mb. is about ¾ of a Gb. And like I said, we oughtta give it a little extra room.  If we’re gonna use Linux Mint 13 XFCE Edition, or the Cinnamon Edition, well, the iso of either of these is about 1.1 Gb., if my memory serves me correctly.  So a thumb-key of 1 Gb. in that event won’t be adequate.  We’ll have to make it say, at least 1.5 Gb.  I’d think that should do it. (Like Grandpa always said:  “You can put 5 lb. of potatoes in a 10 lb bag, but nobody can put 10 lb of potatoes in a 5 lb bag.”)
10. The next thing to do would be to use Unetbootin to create a "live, bootable .iso-type USB thumb-key" on our temporary target thumb.  I prefer doin' this with the version of Unetbootin you get if using Ubuntu 10.04.  It will work * in * theory * from pretty much any Linux, or for that matter Windows Vista or 7.  But Ubuntu 10.04 is my choice, where I am going to use Unetbootin for the purpose described right here.  Not to worry, though, you can just burn a cd, and run it from that, downloading Unetbootin into the computer's memory (Unetbootin is small), and just do the procedure that way, if you don't have another way to boot Ubuntu 10.04.  Or I guess you could just try from another OS—it shouldn't make a difference—** in ** theory **, I am saying.
11. Use Unetbootin to do the “install” of your custom .iso, to the target thumb-key.  When you have accomplished this, shutdown the computer, and boot back-up, this time using your BIOS to boot the targeted thumb.  Check-out whether it works satisfactorily.  If so, then move to the next step.
NOTE that you might need to launch Unetbootin with root-privilege, with something like Alt + F2, type "gksu unetbootin" & Enter.  Or it might work just like it is.  OR—as we're probably doing this from a Linux environment we intend to over-format later/aren't worried about as far as security (a live Ubuntu 10.04 cd or our full-install thumb-key), we could jail-break the root, and do the procedure from that.  It is easy to just use the passwd command to set a real Root password for the full-install USB thumb (or any full install of Ubuntu).  Yes, this will compromise security—* in * theory *.  But if you know how to set a truly hard password (and how to not lose/forget it), I really don't find that it makes a difference to Ubuntu's security:  it's arguably more of a danger that you'll be in Root, and accidentally edit a system-file, requiring Ubuntu to be re-installed.  But I really don't think you'll do that if you are competent enough with a computer to have made it to this point in these instructions.
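For reference, the two Terminal bits mentioned in this NOTE look something like the following (gksu is assumed to be present, which it seems to be on most Ubuntu-family desktops of this vintage; the second command is only for those who really do want to enable the Root account, caveats and all):

gksu unetbootin
sudo passwd root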
NOTE that most USB creators come with a "files browser-button".  If you're not clear on what that is, it's a very * inconspicuous * little thing that looks like this:  […].  Three little dots inside a grey box.  It's easy to overlook.  But a click on it, when you've reached the step in live USB creation where you need to select your custom .iso file, will lead you to that file.  Sometimes, though, this won't work, and you'll need to work out the path to the .iso file, and then type that into Universal USB Creator manually.  It is not hard to learn how to do this with the file-manager in Linux or in WINDOWS (properly called "Windows' Explorer"—not to be confused with * Internet Explorer *—which is the usual web-browser).  A little Googling will help you.  This was another reason for having copied our custom .iso to a folder in our FAT partition:  it usually makes it easier to navigate-to, because you might end-up using Unetbootin from ms Windows, if you can't make things work from Ubuntu 10.04 or other Linux—and Windows, of course, cannot read Linux folders, by default.
12. Once we have successfully created the bootable USB thumb of our custom .iso, and we have tested it to make sure it works well, we come to the tricky part—but this isn’t so tricky.
13. What I do is boot into another Linux system—like for example my full-install thumb-key, or a live cd/DVD.  Then I use Nautilus to navigate to the custom thumb—and open it.  What you see should look something like this:

[This screenshot is of a Sandisk USB bootable thumb of Ubuntu 10.04, created with Unetbootin, and Unetbootin was run from a harddrive install of Ubuntu 10.04.  I installed Unetbootin to the system from Software Center.  I later made this screenshot while in another of my Linux systems (Linux Mint 13), having opened the mounted thumb-key with Nautilus.]

[Screenshot A:1:  scrnshtA-1-for-newbie-get-started]

14. Next, just launch a new instance of Nautilus/your file-manager.  Now navigate to your harddrive’s FAT 32 partition—you ought to be able to click that in Nautilus’ left-view-pane.  Now that it’s open, you might see whatever personal files and folders you have put in there.  And that folder named “Mary_jane”.  All this stuff can be left alone, just as it is.  All you really need to do is start copying the files and folders you see in your custom thumb, to your FAT partition.  That’s it!  Just copy everything over, close Nautilus, and shutdown the computer.  You can probly even use drag-and-drop.  Or else just Right-click on each file or folder, and select “copy” in the pop-up menu: then, move the mouse-pointer back to your FAT 32 partition window, click in there in a neutral space, and use ctrl + v to let it copy.
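If drag-and-drop or copy-and-paste misbehaves in step 14, the same copying can be done from a Terminal, roughly like so.  (The two mount-points are only examples I invented; your thumb-key and FAT 32 partition will be mounted under names taken from their own volume-labels, so check what they really are in your file-manager first.)

cp -rv /media/CUSTOM_THUMB/* /media/STORAGE/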
The FAT 32 partition should now be able to boot your custom Linux, so you don’t need that thumb-key for this anymore (but hang onto it in its present condition for awhile, just in case).  And it will not hurt to add other files and folders to the FAT, once this step has been completed—just don’t mess-up the ones we copied-over:  leave them alone, until-and-unless you know what you’re doing, with respect to working on them.
All we need to do now is understand how to use a boot-loader to boot our newly bootable FAT partition, as we have not allowed any bootloader’s “stage-1” to copy to our MBR.  But this will not be very tough.
15. Of course, we do not have Ubuntu or other Linux installed to the harddrive. So this precludes use of GNU-Grub to boot our FAT (difficult anyway).  And as we only copied files to the FAT partition—and no code was written to our MBR—we cannot use Syslinux directly to boot our new creation—though Syslinux’s main parts are in our FAT partition.  What to do?
16. My preference for this scenario is to boot Windows 7, and download and install EasyBCD.  This will allow you to hand-off the boot process to another boot-loader (other than Windows' own), * but * without * writing * to * your * Master * Boot * Record *.  Which is the whole point of this exercise—the "normal" way, as I've said, is to just use Unetbootin to install the custom .iso as a "frugal"—but this alters MBR, and some of us don't like this.  Doing it as I've described will only alter the MBR of our "temporary target"—the thumb-key, and it's cheaper to replace the thumb than your harddrive, if you later decide you want your MBR back, and can't get it to format again for some reason.  There are plenty of instructions on EasyBCD online, freely available.  NOTE that it is important to make a backup-copy of Win 7's BCD folder, and bury it in another folder somewhere in your harddrive just to be safe.  (And preferably some place such that you can remember where it is.)  Now all we need is a bootloader for EasyBCD to "find", on our harddrive.
17. This should be easy.  Just go to
http://www.icpug.org.uk/national/linnwin/step1-9x.htm
and download a small file called grub.exe.
Another link might be
http://download.gna.org/grub4dos/
where I am told you can d/l grub.exe inside a zip file, and then extract it.  Grub.exe is just one of the two main versions of Grub4dos:  the other is "grldr" (NOT "grldr.mbr"):  but I prefer grub.exe, because it's semi-graphical, * but * you can use its command-line if you want, by hitting the "c" key at pretty much any time.  Grub.exe is incapable of writing to MBR by itself:  it would need an additional file, like for example the file "grldr.mbr" (not to be confused with simply "grldr").  Just installing the file "grldr" basically gives you a command-line-only version of Grub4dos.  The file grub.exe is grldr plus a limited GUI-type booter environment.
18. Now, open your Downloads folder, and, with a Right-click on grub.exe, select “copy”.  Now just paste the grub.exe onto your FAT partition!  Just click in neutral space in your FAT 32 partition, and ctrl + v.  DO NOT make a folder, and put grub.exe in there.  Copy it like I said.  It won’t hurt a thing.
19. Once you're done, all you have to do is set EasyBCD to be able to hand-off to grub.exe at boot-time.  You can then use grub.exe to boot anything on your disk that's bootable—including your FAT partition, since we have done all the above stuff to make it bootable.  There are plenty of instructions for Grub4dos available online.  Really, if you just use grub.exe's command-line, it should be easy to boot anything.  That way, you don't have to alter any config files.  When you see Grub4dos' menu.lst screen, you can hit the "c" key on your keyboard, to get the Grub4dos command-line.  The "chainloader +1" command should allow you to boot any bootable partition on your disk, as sketched just below.
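To give a concrete picture of that last bit, the hand-off from the Grub4dos command-line looks something like the following, once you have hit the "c" key.  (The partition number is only an example:  Grub4dos counts drives and partitions from zero, so (hd0,4) means the fifth partition on the first disk; point it at whichever partition is really your bootable FAT 32.)

root (hd0,4)
chainloader +1
boot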

Method 2:  Install Ubuntu (or a variant) as a live/compressed install to a partitioned USB thumb-key, containing a persistent-save partition you create beforehand.  This will not touch your harddrive or your harddrive’s MBR (Master Boot Record), and it will be the equivalent of booting a custom Linux .iso from a DVD; but you will be able to save changes to settings and files, and the system WILL RUN AT FULL-SPEED.  NOTE that everything I describe in this method is accomplished with GUI, point-and-click free programs, so there isn’t any command-line stuff for you to wrangle with (except for one brief step, late in the game, which involves just one * small * command-line command, which a child could do).
CONDENSATION/WALK-THROUGH for Method 2:
1. Using either WINDOWS or else booting into a Linux environment, you can use either Windows Disk Manager (if you booted into WINDOWS 7), or G-Parted (from Linux), to point to the USB thumb-key you already had inserted into the side of your computer, and, selecting the big FAT 32 partition in the thumb, shrink it down from the RIGHT-end, as it’s depicted/represented in either Disk Manager or G-Parted.  Move the mouse-pointer over the Right-end of the big FAT 32 partition, and, when the double-arrows appear (in G-Parted), push-down the left mouse-button, and, WHILE HOLDING IT DOWN, drag the partition to the Left  <<<  , causing an order to be placed for it to be “shrunk”.  Then, you’ll have to click something like “Edit” in Disk Manager or G-Parted’s toolbar, and then click “apply”, to execute the order:  this gives you plenty of opportunity to back-out, and just close the program, if all does not look right to you.  And that’s what you should do:  if it doesn’t seem like you’ve set-up the “pending operation” right, then just select “cancel”, and try again, or just close the program.  BE BLOODY SURE YOU’VE POINTED Disk Manager OR G-Parted AT THE CORRECT DEVICE (THE USB THUMB-KEY) AS YOUR TARGET—* NOT * YOUR BLOODY HARDDRIVE.
2. Once we have re-sized the FAT 32, we now select the “hole” that’s left—the “unallocated space”; and we will turn this into our Linux persistent-save partition.  THIS WILL PROBLY HAVE TO BE DONE FROM G-Parted OR ANOTHER Linux TOOL, as WINDOWS’ Disk Manager cannot format-to Linux/*nix-type disk-formats, because Disk Manager is only able to deal with WINDOWS’ stuff.
3. Now, we need to do one simple operation from a Terminal in Linux, to give the persistent-save partition a label that the Linux desktop system will recognize, as G-Parted seems only to be able to assign volume-labels at a higher-level, and that won’t be adequate for this purpose.
4. Once we have come this far, we then just need to do our .iso-type install (“live-type install”) of desktop Linux to the FAT 32 partition that we shrank-down.  Then we’ll be done!!
5. When all done, we BIOS-boot our new USB thumb-key, letting it alone while booting, so it will boot with its defaults, so that we can test whether the .iso install worked right, and that the desktop is fully working.
6. If all is well, we shut it down, then wait a half minute or so, and again BIOS-boot our new creation, THIS TIME ENTERING “persistent” AT THE BOOT-PROMPT (without the quotes, of course), & hit Enter.  You will only have a second or two to start typing this—when boot-up reaches that point; but once you’ve typed the first letter ( p ), this “freezes” the clock, and you can take as long as you need to enter the rest:  boot will resume, upon hitting Enter.  This use of a “boot-argument” is needed to tell Linux to look around the disk for its persistent-save partition, or else it will probably just boot into its live-only mode (default).

EXPANSION for Method 2:

1. Step 1:  The first thing we oughtta have is an install of desktop Linux.  But this doesn't have to be to our harddrive.  It can be to a USB thumb-key, like the one we're gonna use to hold our finished-product—a live install of modern desktop Linux, plus a separate partition in the thumb-key to hold persistent-save data, so that we can preserve our settings and changes to files from one re-boot to the next.  Plain iso-type installs of desktop Linux to a USB key, like you make with Unetbootin, or Universal USB Creator from Pendrivelinux.com, or other free tools, well, these * will * * not * save such data by default—although I think there is a slider in at least one of them you can set, that will configure persistence of some sort—but it seems to say that it "works for Ubuntu ONLY".  It's worth understanding as well, while we're at it, that your experience of the two creator-programs I just named (if not others as well) * is * different *, ** depending ** on ** what ** operating ** system ** you're ** running ** the ** tool ** from **.  So, if you fire-up Ubuntu 10.04, install Unetbootin, and then decide to use it to install a custom .iso of Linux you've created ahead of time to a USB thumb-key, then you'll often find different options available to you than if you fired-up Linux Mint, or ms WINDOWS, and installed Unetbootin to one of those (although we don't exactly "install" it to WINDOWS), and you tried to do the same thing.
Understand also, that you may be required to open Unetbootin or other such program with a ** root ** privilege, if trying to use it from Linux, in order for it to complete.  This can probly be accomplished by just holding-down Alt, and hitting F2 once.  When the run-progs dialog appears, type "gksu unetbootin", and hit Enter.  That should launch the GUI app as root—though you'll probly have to type-in your password to the authentication dialog box.  Frankly, I've always made it work by just jail-breaking the Root account in my Linux installs, so I can run as root just for a few minutes, while I create the bootable thumb-key.  But I don't recommend this way of doing it:  the recommended way is to use "gksu".
But anyway, if you don't wanna touch your harddrive at all, then the option is to first do a full-install of our Linux distro to another drive—like an external harddrive, or a USB thumb-key.  We can even use the same USB thumb-key we're gonna eventually use to host our finished-product—a live/.iso-type install of desktop Linux to USB thumb, with a persistent partition in the same thumb:  but I don't like to do it this way.  I prefer to have at least two thumb-keys, because if you forget something, or something goes awry, it is much easier to start-over again, with one thumb that's dedicated just to the full install of the Linux distro.
So why do we need a full install?  Because, dude, the * more * * stuff * for which you make the persistent responsible, the s-l-o-w-e-r Ubuntu is gonna get.  Till it becomes unusable.  The wise way to deal with this, is to A) use a re-mastered Linux .iso you've created ahead of time with some utility like RemasterSys (easy):  this way, most apps you'd want to add can already be part of the .iso image—and not have to go into the persistent-save partition later, slowing Ubuntu or its variant down, and defeating the purpose of what we're doin' here; and, B), just use the persistent-save partition for what RemasterSys wouldn't copy-over, like Dropbox, and certain settings/personalization.
So let’s install modern desktop Linux to a USB thumb-key first, as a full-install (easy); then, we’ll “tweak” it (I like to say “cook” it), adding the programs we want, but which Linux Mint didn’t come with, by default.  (It will be necessary to run the Update Manager first.)  And then adjust our customization/personalization settings the way we want them, and then we’ll download and install RemasterSys (or your favorite re-mastering utility), then run * that *, and we’ll use the resulting new .iso image (which RemasterSys puts in a folder that it “automagically” creates for you) to do our thumb-key install with Unetbootin (or your favorite live Linux USB-creator app).  So how to do this?
There are a couple of ways.  One way is to actually just start from WINDOWS (XP or 7), and download Linux distros, burning them to cd/DVD s [as I have described in the above (main) body of text], until you find one you like, and which recognizes all your hardware—sound and graphics cards, networking and wifi-cards, printer, web-cams, &tc.—and then use this live DVD to install the Linux distro to a USB thumb-key, using the Ubiquity installer that comes with Ubuntu and its variants (or the Anaconda installer, if you’re going with an RPM-based distro, like PCLinuxOS, Fedora, Mandriva, et al).
The other way would be to just use a * third * USB thumb-key, and keep trying different desktop Linux distros on it, creating one bootable thumb-image on it after another—doing this from ms WINDOWS with Unetbootin or other prog—until you find a distro you like, and which likes your hardware.  A better way, baby, would frankly be to install some "virtualization software" to WINDOWS (free—VMware and VirtualBox are two popular ones), and then experiment with different Linux distros, until you find one that's right for you.  This "virtualization software" is very safe, does not touch your harddrive, and is very easy to un-install—even for a novice-user.  There are plenty of "container images" (really, virtual-machine images) of Linux available for these * virtual environments *, free of charge.  You could even just stop there, and try to configure persistence for a containerized-image (of Linux).
The downside is that A) I don't know how you could re-master a container-image:  they seem to come fixed, just like they are; so if you want additional apps, they'd have to go into the persistent-save folder, and my research indicates that this is gonna make Linux s-l-o-w.  And B), this way of using Linux is reputed to sometimes give trouble, where it comes to being able to access external drives, thumb-keys, and devices, and perhaps even getting wifi and/or networking to work.  But again, as with all things Linux desktop—it * depends * on your exact hardware, and how well that supports Linux.  I have talked about this at some length, in the above main body of this text.
ONE THING IS REAL IMPORTANT HERE, where it comes to this business of installing desktop Linux to our first USB-key, as a * full * * install *.
You need to be very careful that you are installing Linux to the bloody USB-key, dammit, and NOT to your bloody harddrive.  OR you’ll over-write WINDOWS.  The Ubiquity installer should do the rest * for * * you * just fine. (For those wanting to use an RPM distro’s Anaconda installer, I’m sorry, but I don’t have any experience with it at the time of this writing.)  Anyway, just understanding that “sda” and/or “hda” is almost certainly gonna be the harddrive * inside * your computer (and therefore the location of your C:\ drive), is key.  REMEMBER, too, that some computers have two or more harddrives inside, so watch yourself.  Refer to what I have said in the main-body, above, as to learning about your machine’s hardware, before you mess with Linux.  This is not difficult to do.
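One quick sanity-check, before you let Ubiquity loose:  open a Terminal from the live session and list the drives the machine can see.  Something like

sudo fdisk -l

will do it (newer distros also ship a handy little utility called lsblk).  The internal harddrive will * usually * show up as /dev/sda and be the big one; a USB thumb-key will * usually * be /dev/sdb or /dev/sdc and match its advertised size.  But "usually" is the operative word here, so read the sizes carefully before you point the installer at anything.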
Once we have created this “full install thumb-key”, we will use that to “cook” our favorite Linux desktop distro to the condition we want, and then create our new .iso image.
Using our full install thumb-key, we can first run the Update Manager, being sure to download and install all the updates we can.  This is standard Linux orthodoxy, where one is about to install any new software.  And we would probably want our new .iso image to be up-to-date.  Next, let’s adjust our personal settings the way we want—setting screen-saver, GUI behavior, and other policies.  Then, add any softwares which we think we will use.
When we've accomplished this, we can download and install RemasterSys.  At the time of this writing, this program exists outside official Ubuntu Repositories—but I'll say I've yet to have a problem with it.  Just Google it—search something like "how do I install RemasterSys to…. "  and add the name and release of your Linux distro.  RemasterSys is also available as a .deb file.  I have installed this to one of my distros in a full-install partition, and it seems to be fine.  But yeah, .deb in Linux is the rough equivalent of .exe files in Windows—so you might want to take the usual security precautions.  Anyway, the link for the .deb would be:
http://ubuntuforums.org/showthread.php?p=10923537
Or you could try another re-mastering program, I suppose.
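If you do go the .deb route, installing it from a Terminal is a two-liner, something like the following (the file-name is only an example; use whatever file you actually downloaded):

sudo dpkg -i remastersys_3.0.0-1_all.deb
sudo apt-get -f install

The second command just asks apt to pull in any dependencies that dpkg could not satisfy on its own.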
Close all open windows, and run RemasterSys (it’s GUI).  This will take a * while *, and watching its progress will be as entertaining as watching paint dry.  But that’s how I prefer to do it.  I think the program also logs its progress, so I guess you could go and look at its log-files, if you decide to go do something else while it’s running, and it completes with an error.  But I’ve never had this happen.
Once RemasterSys has  done its thing, you will be able to just open your Home folder, and look for a new folder called “Remastersys”.  Your new .iso of your customized Linux os will be in that folder.  Can’t find it?  You need to be sure you’re going all the way to the TOP of your folder tree, and work down.  You’ll soon stumble upon the new folder.  I’ll use the Nautilus File Manager here to illustrate my point, though your experience in this context will probly be about the same, for most of the popular Linux Files-Managers.
It might seem a little confusing, but in most desktop Linux distros, there is "Home", and there is "home".  And the one with the capital letter is actually * secondary *.  So "Home" is ** beneath ** "home".  The only thing above "home" is "root file system".  You oughtta be able to find the RemasterSys folder by just going to "home".  Nautilus (and I think most of the rest) open in "Home", with the big H, by default; pay attention to the teeny, tiny arrows at the top of the Nautilus interface.  One that's pointed to the Left will usually take you to a higher level of the file-system.
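If the point-and-click hunt gets tiresome, a Terminal one-liner will usually flush the new image out.  The command below just looks for any .iso sitting up to three folders deep under /home (adjust the depth, or the starting folder, if yours turns out to be buried deeper):

find /home -maxdepth 3 -name "*.iso" 2>/dev/null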
Once we’ve found our custom .iso, we can re-name it, if we want.  Take care, as to choosing a new name, if you decide to do this.  Don’t make it too long; don’t make it so short, that you won’t be able to tell what it is, in the future, because you may have it around for a long while.  Don’t make the new name too complicated, with special symbols, or capital letters, or the like.  And use the underscore (“ _ “) to separate words.  As:  My_custom_ubuntu_1104.   NOTE that I did not type “ubuntu_11.04”:  I omitted the period.
So far so good.  What I prefer to do next is to make a backup copy of this custom .iso, to my WINDOWS system.  This is easy.  Right click on our new .iso file, and select “copy”.  Now, open a new instance of Nautilus.  You can just do Alt + F2, and type “nautilus” (all lower-case), and hit Enter (this won’t work in Ubuntu 11.10, though; but you can open Nautilus from menus—however Alt + F2 works in pretty much everything else).  Now use the Left view-pane, to find your WINDOWS files.  This could take a few tries, if you’re a novice—and be careful:  we don’t want to get into WINDOWS actual system-files (the “Registry”).  Find your WINDOWS “My Documents”.  Now use ctrl + v, to copy our new .iso into that.  You could copy it to “Downloads”, or “Pictures”, or another folder, if you wanted.  This will not only back-up our new .iso, so that we have a back-up copy, but it will also facilitate the creation of our bootable custom USB thumb-key.  We might, for example, want to use a bootable thumb-creator that only works from WINDOWS, if we like that particular program.  That’s how I often do it.  I like to use Universal USB Creator from pendrivelinux.com.  This is a mouthful, but it’s a pretty good program.
NOTE that most USB creators come with a "files browser-button".  If you're not clear on what that is, it's a very * inconspicuous * little thing that looks like this:  […].  Three little dots inside a grey box.  It's easy to overlook.  But a click on it, when you've reached the step in live USB creation where you need to select your custom .iso file, will lead you to that file.  Sometimes, though, this won't work, and you'll need to work out the path to the .iso file, and then type that into Universal USB Creator manually.  It is not hard to learn how to do this with the file-manager in WINDOWS (properly called "Windows' Explorer"—not to be confused with * Internet Explorer *—which is the usual web-browser).  A little Googling will help you.
NOTE that you should have the USB-key you are intending to use already plugged-in to the computer, and already partitioned, before starting your USB-creator.  Read on.
2. Step 2:  The USB thumb-key we’re gonna use as our “target” for the “finished product”—our install of our new .iso image—should probly be at least 8 Gb in size, although it is often possible to use a 4 Gb. thumb-key.  Remember that we need to create * two * partitions—one for Ubuntu/an Ubuntu-variant, like Linux Mint:  and one for our persistence.  NOTE that it is just as possible to use * two * thumb-keys instead—one for the Linux distro (Ubuntu or a variant), and another USB thumb-key just for the persistence.
Let's try to understand that, when we partition a thumb-key so that it has more than one partition, then when we boot into, say, some install of Linux or maybe MAC OS/X from our harddrive, we will find that the thumb is seen as * two * USB thumb-keys.  Each partition will be recognized as a separate thumb-key.  This is what we * want *.
We need to understand that we are intending to install Ubuntu or a variant to the first partition—but, in the context of this article—as a * compressed * install.  What that means, is that we cannot shrink the thumb’s FAT 32 partition to a size smaller than the size of the Linux distro’s size as an .iso file, PLUS we should give it a little extra room—say, on the order of perhaps 150 Mb. or so—because we’re also gonna get the Syslinux bootloader onto the thumb key, which is good for being able to start .iso-type installs.
If the iso image of Ubuntu 12.04 happens to be 760Mb., then let's settle on about 1 Gb. for the size of our first partition for our thumb-stick.  Because 760 Mb. is about ¾ of a Gb.  And like I said, we oughtta give it a little extra room.  If we're gonna use Linux Mint 13 XFCE Edition, or the Cinnamon Edition, well, the iso of either of these is about 1.1 Gb., if my memory serves me correctly.  So a partition of 1 Gb. in that event won't be adequate.  We'll have to make it, say, 1.5 Gb.  I'd think that should do it.
Like many people, I'm a Dropbox user, and I want to be able to have this functionality available in my live-boot of desktop Linux.  So I tend to try to reserve just enough space in the first partition to hold the iso-install of Linux, and then use about, say, 3 Gb. for the persistent—because Dropbox gives you 2 Gb. of free space on their servers.  More than that, and you have to start paying.  But that's a bit of a moot point here.  Why?  Because, dude, the * more * * stuff * for which you make the persistent responsible, the s-l-o-w-e-r Ubuntu is gonna get.  Till it becomes unusable.  The wise way to deal with this, is to A) use a re-mastered Linux .iso you've created ahead of time with some utility like RemasterSys (easy):  this way, most apps you'd want to add can already be part of the .iso image—and not have to go into the persistent-save partition later, slowing Ubuntu or its variant down, and defeating the purpose of what we're doin' here; and, B), just use the persistent-save partition for what RemasterSys wouldn't copy-over, like Dropbox, and certain settings/personalization.
3. Step 3:  Let’s finish partitioning our USB thumb-key, and then use it to create our bootable custom thumb.  So while we’re in Linux (perhaps running from that full-install thumb we made), let’s plug-in our “target key”.  Plug it into the computer’s extra USB port.  Or you can use one of those multi-port hubs, if you don’t have enough USB ports available.  Let the system recognize it.  Now, let’s go to our System menu (or use Alt + F2), and start G-Parted.  Let’s use G-Parted to partition the USB key as I described above:  shrink the lone FAT 32 partition to a size a little larger than our custom .iso.  Next, click on the hole we left (the “unallocated space”), and let’s select “Partition” from G-Parted’s toolbar, select “New”, and make a new partition in here, of about 3 or 4 Gb.  This will become our persistent-save partition.  If  G-Parted wants you to “unmount” the USB key to work on it, go ahead and unmount it.  I like to use .ext2 for the format, because I might want to encrypt the persistent partition someday.  If you think there’s even an outside chance you might want to encrypt your persistent-save partition someday, .ext2 is your best bet.  Otherwise, you could use .ext3 or .ext4.  Even ReiserFS or btrfs would probly work.  Name the persistent-save partition “persistr”.  This is NOT the correct name:  but we’re gonna have to tackle * that * from Terminal, with a simple command, as the persistence software in Ubuntu does not recognize  G-Parted’s naming.
4. Step 4:  Now that we have finished partitioning the "target key", let's close G-Parted, and open a Terminal window.  Before closing G-Parted, let's make a note of the command-line designation of the "persistr":  is it identified as "sda7", "sda10", or what??  This is important, as we will need this info to proceed.  Once in Terminal, let's use the e2label command to re-name our "persistr" to the name that needs to be recognized, at a lower-level.  So,
sudo e2label /dev/sda7 casper-rw
This must be typed with * exactitude *.  AND YOU SHOULD BE SURE YOU UNDERSTAND THE "ADDRESS" OF THE "persistr" PARTITION FIRST:  is it located at sda7?  sda8?  Or what?  (On many machines the internal harddrive is sda and a USB thumb-key shows up as sdb or sdc, so it may well be something like sdb2.)  Re-open G-Parted and check, if you're not sure.
Remember that you will be asked for your password in Terminal, and that Terminal will not display (“echo”) your keystrokes as you are typing a password—not even represented as dots.  So you just kinda gotta mentally keep track.  You’ll be able to try any number of times, if you fail.  If you get disgusted, you can just close Terminal, and try later.  AND BY THE WAY, the e2label util DOES NOT give you any indication that it completed successfully:  indeed, it takes it only a second to re-name a partition, and then you just see your command-prompt again, with the hostname and your login nick-name shown, ready for another command.  Once this has been achieved, we can check it:  we can close Terminal, and then launch G-Parted again.  Use  G-Parted to have another look at our target-key:  it should show that “persistr” is no longer in there, and it has changed to “casper-rw”.
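If you would rather double-check from the same Terminal instead of re-opening G-Parted, either of these should do it (again, /dev/sda7 is only an example; use whatever "address" your partition really has):

sudo e2label /dev/sda7
sudo blkid /dev/sda7

The first just prints the partition's current label; the second prints the label along with the UUID and the file-system type.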
5. Step 5:  Once we are this far, let’s just shut-down the computer, with our USB drives still plugged-in.  This is always the safest way to remove USB drives—no matter what os you’re using.   Once it is shut-down, carefully remove the USB that is the * full * Linux install, and leave the “target” drive plugged-in.
6. Step 6:  Now boot the computer back-up, booting into WINDOWS (this is how * I * do it, so I can use Universal USB Creator from Pendrivelinux.com, which won’t run in Linux:  you could use Unetbootin or another program from Linux, if you wanted).  NOTE that our “target key” is now recognized as two (2) separate devices, and that WINDOWS can’t tell us the nature of the second device (the “casper-rw”).  We will target the first device/the one that WINDOWS can fully recognize as the destination for our compressed, live install of our custom .iso:  this is the part of our thumb-key that we left formatted to FAT 32.  The old FAT 32 disk-format is the one Linux uses, to run as a compressed, live files-system, when booting off a thumb-key (or a harddrive—in the case of a “frugal install”).  The “casper” will handle our persistence, when we finally boot our custom iso-type Linux with the intervention of the “persistent” command, at boot-time.  You will only have a second or two to start typing this—when boot-up reaches that point; but once you’ve typed the first letter (p), this “freezes” the clock, and you can take as long as you need to enter the rest:  boot will resume, upon hitting Enter.
Like I said before, you can use the "files browser button" in a shell such as Universal USB Creator from Pendrivelinux.com, to point the program at your custom .iso file, so it can be installed to your USB stick, and have the USB-creator make this bootable for you.  But you might have to find the correct path of your custom .iso file with WINDOWS' files-manager, * because * it's a custom .iso—not a stock one that could've been downloaded from the web.  It might do well to practice a couple of times first, on a thumb-key, using Universal USB Creator from Pendrivelinux.com.  You'll see how easy and safe it is.  Unetbootin, too.
Just be careful, as to the location you designate for installation of the .iso—although modern versions of these two programs don’t seem to allow you to select your WINDOWS C:\ drive by accident.  So they’re safer now than ever before.  Don’t worry about using some slider, or other feature of the USB-creator to set persistence:  with the steps I have described, this should already be in the pipe-line.
7. Step 7:  So now we’re essentially done.  All we need to do is test it, and see A) if the USB key we made will actually run our custom Linux desktop; and B), if the persistence is working.  If these two things work, you’re good to go.

Method 3:  Use a FAT-32 partition you have created in your harddrive (with G-Parted or other tool) to install a customized .iso of desktop Linux, so it can be booted as a read-only operating system—which is to say a "compressed-install", or "live file-system".  This will be the equivalent of booting a custom Linux .iso from a DVD or USB thumb-key; but you won't have to lug the DVD around with you, and the system WILL RUN AT FULL-SPEED.
This Method essentially just describes using Unetbootin to frugally install a re-mastered Linux distro to a FAT 32 partition in your harddrive, and establish Syslinux as your master bootloader ("MBR bootloader").  This method basically requires you to also install another bootloader (Grub4dos' version "grub.exe") in order to be able to boot Windows Vista/7 (or else learn to adjust the config files of Syslinux with a command-line text editor, as Syslinux can't seem to "automagically" recognize Vista/7; Syslinux often * will * auto-recognize XP).  (Really, you just copy the Grub4dos version known as "grub.exe" to the FAT 32 partition, so this part isn't like "installing" a bootloader:  but the Syslinux bootloader * will * be installed, with this Method.  So to try to be a little more clear:  with this method, no Grub4dos code gets installed to MBR; but Syslinux code * does * over-write MBR.)  There are other ways, however, to solve the issue of Syslinux's difficulties with directly booting modern versions of Microsoft Windows (Windows Vista & 7).  What I depict here is the way that suits * me *.

There are some nicer, more up-to-date instructions for this Unetbootin scenario at:

http://www.wikihow.com/Install-Linux-without-a-CD-or-USB-Stick-Using-UNetBootIn
PLEASE NOTE that this procedure is at least somewhat of a * RISK *, as it alters boot-loader configuration, and writes some code to your MBR (Master Boot Record)—although one * should * be able to change this back, if one wants.
You use either A) Windows, or B) your Linux install to download Unetbootin.  (I've only done this part with Unetbootin so far, and only using it from Ubuntu 10.04.  It seems this Method won't work from Windows 7 or Linux Mint 12 and later, as the interface of Unetbootin available for those operating systems does not let you select a harddrive FAT-32 storage partition you've created.)  Make sure your remastered .iso is in a place where you can navigate to it; you might have to make a copy of it to an internal directory in either your Windows install or your Linux install.  You should even be able to just create a folder in your FAT-32 partition ahead of time, name it something like "Backup_isos", and put the file in there.  What matters is that you can navigate to it while in Unetbootin, and that you select the right file.  Don't, for instance, select the .md5 file by mistake.  But if you do, Unetbootin will probably just give you an error message.
I prefer to use RemasterSys for Ubuntu and its variants, to create a new, custom .iso of my Linux.  This is needed because we don't want to do the frugal install with the stock .iso image and then shove all the new programs and customizations into the persistent-save folder or partition:  the more we make Persistence responsible for, the slower things tend to get.  There are other ways to re-master Ubuntu, but I find this the easiest.
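For what it's worth, RemasterSys also has a simple command-line front-end.  A hedged sketch only (the output filename is just an example, and exact options can differ between RemasterSys versions):

sudo remastersys backup custom.iso    # snapshot that also includes your user settings
sudo remastersys dist                 # or:  a "clean" image with no user data in it

Either way, the finished .iso is what you then feed to Unetbootin in the steps below.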
You’ll need to run  G-Parted, or other disk manipulation program with which you’re comfortable, to shrink the Linux install in your harddrive (if you have Linux actually installed to your hdd), and Windows’ Disk Manager to create a “shared” FAT-32 partition somewhere on your harddisk.  You could use G-Parted for the latter also, but I like Windows’ Disk Manager for creating/working-on ms-type partitions on a harddisk.  A FAT 32 partition is accessible from * both * Linux ** and ** Windows, which is another nice thing about it.
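If you prefer to do the formatting step from a Linux Terminal instead of from Windows' Disk Manager or G-Parted, a minimal sketch would be something like the following, assuming the new partition ended up as /dev/sda3 (an assumption; check first, because formatting the wrong device destroys whatever is on it):

sudo fdisk -l                               # double-check which partition is the new, empty one
sudo mkfs.vfat -F 32 -n SHARED /dev/sda3    # format it FAT-32, with the volume label SHARED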
Windows usually labels this FAT-32 storage as “E:\” drive—except for netbooks, and devices which don’t have a cd-tray—in which case it usually becomes “D:\”:  but be careful!  If you copy/format to the wrong drive or partition, you could be in big trouble.  Using Unetbootin (running from Ubuntu 10.04, at least), it is possible to just treat a FAT-32 partition on your harddrive as a USB thumb-key.
You could probly even run Unetbootin from a Ubuntu 10.04 live cd for this use, downloading Unetbootin into the computer’s memory.  Run Unetbootin and have it install your new .iso as a bootable “frugal”/”P.M.I.” install, being ** sure ** that Unetbootin is pointed at the FAT-32 partition you created as destination (“target”).  When you then shutdown cold and boot back up, you’ll be in your re-mastered Linux desktop—but as a * compressed *, read-only file-system, which cannot be altered (till you un-install it).  Unetbootin will also have installed Syslinux as masterbootloader on the machine.  There is online documentation for Syslinux, but I found it rather above my head (though, again, you must take into account that * many * things are over * my * head).  Syslinux will often auto-recognize and therefore boot Windows XP all by itself (or so I’m told); but to boot W7 or Vista, it may need to be tweaked by you.  Further, you might have other (experimental?) Linux/*nix installs on your disk, which you wish to boot.  What to do?  The easy-out I found is to just learn to use Grub4dos’ excellent booter “grub.exe”, as a “go-between”, or “chainloader”.
Just go to
http://www.icpug.org.uk/national/linnwin/step1-9x.htm
and download a small file called grub.exe.
Another link might be
http://download.gna.org/grub4dos/
where I am told you can d/l grub.exe inside a zip file, and then extract it.
This (grub.exe) is just one of the two main versions of Grub4dos, and can be d/l-ed on its own, as a file.  Once you have it, you just copy it to your FAT-32 partition.  Be sure to just copy it to the FAT-32 partition—* don’t * put * grub.exe * in * some * folder * in * there *.  Once it’s in there, when you boot the computer from cold, and you don’t want to boot the Linux frugal install, but rather WINDOWS ( or something else), you can hit the “c” key on your keyboard at the Syslinux prompt—before Syslinux times-out into booting your compressed Linux; then you type “grub.exe”, without the quotes.  Hit Enter, and Syslinux hands boot control over to Grub4dos!  From here, if Grub4dos doesn’t auto-recognize the other os on your disk, you can chainloader it to GNU Grub, if you have GNU Grub on your harddisk.  You would type:
find --set-root /boot/grub/core.img
kernel /boot/grub/core.img
boot
Type one line at a time, and hit Enter after each line.  This chainloaders you to good ol' Grub2 proper (its "core.img" stage), which is still alive in the (first) Linux partition of your harddisk.  From there, you can easily boot anything you always could, Linux or Windows.  I've only tested this "multiboot-disk" arrangement on 3 machines (all laptops); but it seems very reliable, and does not add much time or work to booting-up the machine, because after a while you just remember what to type, without much thought.  You may spend a lot of time in your new, compressed iso-install of Linux anyway, which will boot by default, without intervention from you.  And unlike live cd/DVD boots of most os (except things like Puppy Linux and KNOPPIX), your "frugal/P.M.I." install of your desktop Linux running from your harddisk will run at * full ** speed *, so it won't be slow.
As far as persistence goes (for example, a persistent-save partition, so that you can download and install updates to your new frugal install of Linux):  with a compressed file-system, why do you need to d/l updates?  The main purpose of updates, for me, is to protect against getting malwared, even in Linux.  But a compressed, read-only file-system is very hard to malware/root-kit/etc. (at the time I write this).  Note that such persistent-save folders are best kept to a very minimal size anyway, using them only for the few additional settings changes and the like that you really have to keep, because they tend to slow the system down very significantly, at least at the time of this writing.  That's why you install the distro first (to a harddrive partition or thumb-key), then tweak it, and * then * re-master it.  This "frugal install" arrangement is not that hard to un-do, either:  I've done so twice now, I guess because I'm a bit of an experimenter, and am not satisfied with what I've created.
NOTE that a persistent-save “folder” (really, people probly most often use a separate * partition * on the harddrive for this—but it’s referred-to as a “persistent-save * folder *” anyway) can be * encrypted *, if you want to.  But you may have to use the .ext2 legacy (non-journaling) partition format for it, as I am not aware of another disk format that will support encryption of a persistent-save folder, at the time of this writing.
If you really want a persistent-save area, you should be able to just follow the steps in Method 2, which regard this.
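For reference, one common way of giving a frugal install a persistent-save area without re-partitioning anything is a loop-back file named casper-rw, sitting in the root of the FAT-32 partition.  A minimal sketch, assuming the FAT-32 partition is mounted at /media/SHARED (an assumed mount point) and you want roughly 1 GB of persistence:

dd if=/dev/zero of=/media/SHARED/casper-rw bs=1M count=1024    # create an empty 1 GB file
mkfs.ext2 -F /media/SHARED/casper-rw                           # format it ext2 (-F because it is a file, not a device)

The live system still has to be booted with the "persistent" option before it will notice and use this file.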
SO WHAT IF YOU JUST WANT TO DO THIS "FRUGAL INSTALL" THING FROM A LIVE CD, DVD, OR USB THUMB-KEY, AND NOT HAVE ANY LINUX ACTUALLY INSTALLED TO YOUR HARDDRIVE (and therefore not have Grub2 available to chainloader-to, so you can boot into WINDOWS Vista or 7 if you need)?
Well, you can do it this way.  Things might be a little more complicated when it comes to setting up booting of Windows Vista or 7 without Grub2 available, i.e. without a "traditional" side-by-side full Linux install on the harddrive alongside ms Windows.  Where Grub2 * is * present, a person can just open a Terminal in its associated Linux install and run "sudo update-grub"; give the process time to complete (usually about a minute), and Grub2 will then be able to boot any instances of Vista or Win7 on the system, whenever Grub2 is chainloadered-to or otherwise invoked.  One need only do this update once, until-and-unless another operating system is later added to the system (as a full install, I mean).
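In other words, where a normal side-by-side install exists, the whole job amounts to the one command below.  The "Found Windows..." line is only the sort of thing os-prober typically reports; the exact wording and partition will differ on your machine:

sudo update-grub
(typical output includes a line such as:  Found Windows 7 (loader) on /dev/sda1)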
But suppose you don’t want to have Ubuntu installed to the computer’s harddrive at all (except, of course, for our “frugal” install to the FAT-32 partition)?
There are various ways to handle this.  The most promising that * I * can come-up with at the time of this writing, would be to just follow the instructions available online for Grub4dos, pertaining to booting Windows 7.  Remember that under Method 3 here, we have copied our download of Grub4dos’ file “grub.exe” to our FAT-32 partition (and * without * putting it inside a folder there, as I said earlier).
Grub4dos’ grub.exe can be used to boot Windows Vista and 7.  And it is easy—* easy *—to start/chainloader grub.exe from Syslinux.  Just boot-up from cold:  when you see the Syslinux bootloader splash screen (you know, with the bald-headed guy), then type “grub.exe”.
While your machine is turned off, press the power-switch, to boot the computer from a “cold” state.  As Syslinux is the MBR bootloader in this scenario, its  boot screen will appear (“bald guy”).  Then we want to get its commandline—this may appear, of its own accord; if not, run through another boot, and this time hit the “c” key on the keyboard, when Syslinux appears.
At the Syslinux command prompt, type "grub.exe" and hit Enter (this is what boots Grub4dos' grub.exe file, which you must already have copied to the vFAT/FAT-32 partition).
This should cause what is effectively Grub4dos to load.  From there, you may be presented with a menu option you can select, to boot Windows 2000 or XP.  Where it comes to Vista/7, you may have to hit the "c" key on your keyboard, to go to Grub4dos' commandline environment.  There are plenty of instructions for Grub4dos available freely, online.
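Put in black and white, the hand-off amounts to a single line typed at Syslinux's text prompt (the prompt reads "boot:"); grub.exe is deliberately built so that Syslinux can load it as though it were a Linux kernel, which is why this works at all:

boot: grub.exe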
One way is to use the Grub4dos chainloader +1 command, to try to boot Windows Vista/7.
Try the following:

grub> chainloader (hd0,1)+1
An important NOTE here is, of course, that we understand the particular syntax of how this version of Grub identifies the disk-volume (partition) it's supposed to attempt to boot.  Grub2 (GNU Grub 1.98, 1.99) has changed to a different syntax for identifying a partition (in Grub2, partition numbering starts at 1, so its "(hd0,1)" or "(hd0,msdos1)" means the * first * partition), and some of us have gotten used to this, and inadvertently try to use the new way in Grub1 (Grub Legacy) or Grub4dos.  And that's a no-no.
So what you need to do for this is first understand what partition your Windows 7 os is located on.  It will almost certainly be on one of the "primary-type" partitions on your harddisk, as opposed to a "logical-drive" type volume.  There can be at most four "primaries" on an MBR-partitioned harddrive (though your laptop might have more than one harddrive; most DO NOT).  So this narrows it down a lot.  And it HAS to be an NTFS partition:  Windows Vista/7 won't install to anything else (not even FAT-32), by default.  So what is typed in parentheses in the above example is the way of designating a partition to be booted by Grub4dos.  The "hd" means "harddrive".  The zero [hd0,] means the first (and probly only) harddrive in the laptop.  Why not just use a one [1]?  Because Grub Legacy and Grub4dos don't, that's why:  they count the harddrives from zero, as "0, 1, 2, 3...", and not "1, 2, 3...".
The comma is just a separator:  it tells Grub4dos that the next numeral is gonna be the disk-volume (partition) inside the designated harddrive, which we wanna boot with Grub4dos' chainloader +1 command.  So you need to enter the right number, the one that corresponds to your Windows 7 partition.  This number is also counted from zero, not from one, in Grub4dos.  So "(hd0,1)" means the ** second ** partition on the harddrive "hd0".  Which is where Windows 7 is likely located:  computers that came with Vista/7 pre-installed often reserve the very first harddisk partition for a small boot/recovery environment, separate from the main Windows partition, that can (theoretically) be used to repair Windows if it got seriously virused and Safe Mode or other means proved unsuccessful.
So this method of booting Windows 7 using your grub.exe file is worth a try.  Other variations on the syntax for doing it this way are available online.  But research them carefully first.
Remember that you then have to confirm the instruction you have set up by typing "boot" at the next   grub>   prompt, and hitting Enter.

grub> boot
This command won't work if you specify an empty sector or file, or if the code in that file is corrupted or somehow incompatible.  There needs to be some boot-loader code in the file or device (sector) you specify.

If it doesn’t work, try a different partition, or try loading the kernel and initrd.img directly.
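One other commonly documented Grub4dos trick, different from the chainloader (hd0,1)+1 style above, is to let Grub4dos hunt down the Windows boot manager file itself, so you don't have to guess partition numbers.  A hedged sketch; on Vista/7 the boot manager is normally a file named bootmgr in the root of the boot partition:

grub> find --set-root /bootmgr
grub> chainloader /bootmgr
grub> boot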

Method 4:  Good News!  One of the improvements to GNU Grub is that Grub2 can now boot an .iso directly.  I will try this myself when time permits, and post my results here.  Until such time arrives, here's a link to the best instructions I've been able to find so far:

https://help.ubuntu.com/community/Grub2/ISOBoot
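Until I can test it myself, here is roughly what those instructions boil down to:  you add a "loopback" menu entry (for example to /etc/grub.d/40_custom) and then run "sudo update-grub".  The ISO path and the kernel/initrd filenames inside the ISO are assumptions on my part; they vary from release to release (initrd.lz vs. initrd.gz, for instance), so check the contents of your own .iso first:

menuentry "My custom Linux ISO" {
    set isofile="/boot/isos/custom.iso"
    loopback loop $isofile
    linux (loop)/casper/vmlinuz boot=casper iso-scan/filename=$isofile noprompt noeject
    initrd (loop)/casper/initrd.lz
}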

Method 5:  Just use KNOPPIX.  A friend of mine (a real gamer from the good ol' Windows 95 - XP days) recently remarked, when I mentioned KNOPPIX in conversation:  "Is that still going?"  Well, the answer is "Yes, it is".  KNOPPIX never stopped.  The KNOPPIX Project and Linux distribution have continued right on, unfettered, since the inception of the project.
The DVD version of KNOPPIX comes with most programs you could ever want or need.  It would be easy to install it to a USB thumb-key, or a FAT-32 partition in your harddrive, using a graphical program like Unetbootin.  KNOPPIX also comes with its own (GUI) "frugal installer", to allow you to "install" KNOPPIX as a compressed live-image to a USB thumb-key or a FAT-32 partition.  KNOPPIX people will often also refer to a "frugal", "compressed"/"live system-image"-type install as a "P.M.I.-type install" ("Poor Man's Install"), I suppose because us not-so-affluent people may not have modern harddrives with more than, say, 40 GB of space, and this "frugal" install takes up * way * less space on any drive.
KNOPPIX, as a matter of fact (unlike Ubuntu and Fedora, et al.), does not have a graphical installer program:  there is only a shell-script, and even for those who know enough about harddrives to feel competent running it, the makers of KNOPPIX discourage full/uncompressed installation of their product.  KNOPPIX is intended to be used as a live system-image only.  One could, I suppose, install it (either the full-blown DVD version or the much smaller cd build, "microKnoppix") to some partition (on either a harddrive or a USB thumb-key), then add RemasterSys (which is said to support all Debian-based Linux, and KNOPPIX * is * Debian-based; it is just based directly on Debian, rather than on Ubuntu, which is in turn based on Debian); then re-master the installed KNOPPIX once you've added the few Linux programs it didn't already have, and then use the resulting .iso to do a frugal install, to a thumb-key or a partition in a harddrive.
KNOPPIX also has its own on-board USB-creator, and its own GUI tool with which to set up persistence once it has been installed as a frugal/P.M.I./"live, compressed filesystem image"; *** BUT *** I'm pretty sure you have to use one of the "+ ADRIANE" builds in order to have these tools available.  Which is okay.  The vocal ADRIANE audio helper tool for the visually impaired can be switched off if you want, once you get things going.
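If memory serves, that on-board USB-creator can be started from the KNOPPIX menus, or from a Terminal inside a running KNOPPIX with the command below; I am stating this as an assumption, so double-check it against the KNOPPIX documentation for your release:

flash-knoppix    # KNOPPIX's own "install to flash disk" tool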
KNOPPIX has been built from Day 1 to function as a frugal/P.M.I.-type system, and it does this well.  KNOPPIX is NOT a "rolling release" type distro:  you update it by upgrading to the next release of KNOPPIX, which the KNOPPIX project puts out "when it's time", i.e. when they think it's good and ready.  So there is no fixed date for the next release.  And no Updates Manager to bother with [though I suppose a person could install one (?)].
KNOPPIX, as I've probly said in the main text-body above, is Debian-based, but is Qt toolkit-oriented, as opposed to GTK+-oriented like most of the other major Linux-based desktop operating systems.  KNOPPIX stuck to its KDE desktop GUI roots, whereas most of the other Linux desktop distros seem to have switched to GNOME (and therefore GTK+ orientation) back in roughly the early 2000s.  Back then, this might have mattered, if you were a person with a KDE-oriented distro who wanted to use an app built with GTK+, or vice-versa.  I'm not sure if it really ever mattered back then.  But to-day, at the time of this writing, * it * does * not * seem * to * matter * much *.  No Linux Desktop Environ I've yet encountered seems picky about running either Qt-based or GTK+-built apps.  I've researched the following, and none of these seem incapable at all of running either type of Linux app (or apps built with some other toolkit, for that matter):  GNOME (2.x, GNOME 3), KDE (3, 4), XFCE, IceWM (technically just a Window Manager, but it effectively has much of the power of a full-blown DE, especially if you also install ROX-Filer, even if you're going to use another files-manager as default); Fluxbox, Openbox, Blackbox (more of a mere Window Manager); Enlightenment E17 (a very good full-service DE, though the controls take a good deal of getting used to); Ubuntu's new Unity, and LXDE.  ALL these seem fully capable of running pretty much any kind of Linux app, though it might be necessary to also install a little bit of "helper code" to fully enable this flexibility for different types of apps on certain distros (like maybe Slackware).
Other than that, I’d think you ought to have little trouble, in running whatever type of Linux app you want, on pretty much any of the Desktop Environs just mentioned.
The current releases of KNOPPIX (the 6.x series, at the time of this writing) use LXDE as the Desktop Environment, by default.  The user experience, I find, is a great deal like Windows 7/XP, except that the wallpaper is very different.  But controls work in remarkably similar ways.  There is a "Start" button in the lower-left, and it yields results similar to ms Windows.  LXDE is a very stable and capable DE for Linux, yet uses very little RAM and CPU/system resources.  LXDE will run well on top of most Linux distros, ** except ** it has been reported that it has difficulties when on top of Ubuntu.

Hope this helped.   –L.L.
