To set the default application for a given file type--say, for example, to always use gThumb for .png image files: right-click any file of that type, choose "Open with Other Application...", select the application you want from the list, make sure the "Remember this application" tickbox is ticked, and click Open.
After that, whenever you double-click a file of that type, your chosen app will automatically be used to open the file. You can also, on a one-time basis, open a file with some other app by doing the same thing but unchecking the "Remember" tickbox. (That can be handy if, for example, you want to view an HTML file instead of editing it, or vice-versa.)

You can get some basic usage information on any command by opening a terminal and executing command --help (where, of course, "command" is the command of interest). You can get much more extensive information (often more than you want) by instead executing man command, which will display the manual page for that command (which will probably much exceed your screen's length, so you will have to scroll through it with the cursor and Page keys).

The way 'nix operating systems construct their directory "tree", everything that is specific to a particular user will be in that user's home directory, which will be under the main /home directory: for example, user algie will have a personal home directory at /home/algie. The idea is to separate user files from system files.

There is, however, an important exception to that guideline: the local .gvfs hidden directory in every GNOME-system user's home directory. The contents of that subdirectory (the "gvfs" stands for GNOME Virtual File System--don't ask), which some feel should be a system directory somewhere else, are not anything the user creates, has control of, or--most critically--should ever try fooling with. Most important is to make sure that subdirectory is omitted from such things as desktop-search indexing, rsync backing up, and similar activities. If included, it will at best stall the process concerned, and at worst might destroy some files elsewhere (such as on your backup medium). In the indexing and backup procedures in these documents, we expressly avoid .gvfs, but be aware of the issue in other uses.
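The exclusion urged above can be sketched concretely. The directory names and file contents below are invented for illustration; the demonstration uses tar (so it runs anywhere, on a throwaway directory tree), while the commented rsync line shows the form an actual backup would take.

```shell
# Illustrative sketch only: the paths and files here are made up.
# The rsync form for a real backup would look something like:
#   rsync -a --exclude='.gvfs' "$HOME"/ /path/to/backup/
src=$(mktemp -d)
dst=$(mktemp -d)
mkdir -p "$src/Documents" "$src/.gvfs"
echo "real user data" > "$src/Documents/notes.txt"
echo "virtual mount"  > "$src/.gvfs/phantom"
# Copy the tree while excluding any component named .gvfs:
tar -C "$src" --exclude='.gvfs' -cf - . | tar -C "$dst" -xf -
ls -A "$dst"
```

The point is simply that .gvfs never reaches the copying stage at all, so the backup cannot stall on the virtual filesystem.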
If you aren't sure whether some internet activity is slow because the remote server is slow or because your connection is dodgy, open a terminal, execute ping google.com, and watch the results. If your basic connection is sound, you'll get a march of lines down your display, each ending in something like time=xxx ms. What you are looking for is 1) that the progression of lines be steady (one per second) and not intermittent; 2) that the times be fairly short (ideally well under 100 ms, but at least not over, say, 500 ms or so); and 3) that the times be pretty similar. This is not a delicate measuring method: it is a quick, simple, crude glance at your current state of connectivity. Many ISPs, from small and local to national, give service that can vary drastically from hour to hour, or even minute to minute, so if you're seeing an apparent slowdown, try this test. (If the pings look good, then you know that it is whatever you are trying to reach--say, the supplier of the slow-to-load web page--that is the issue.) You end the session by pressing Ctrl-C, or you can just close the terminal.

We suggest Google because they are a pretty much fixed resource; you can ping anything you like, but the idea is to choose something whose capacity can be assumed huge and reasonably constant. "Ping" is a utility that sends out a single packet of data over the network and waits for a response, timing the delay between send and receive; it thus tests the instantaneous speed of your connection to the target host. There is something called Network Tools that includes, along with a lot of stuff the average user won't want, a graphical Ping tool, but calling it up is a nuisance, and it has a habit of locking up when there are connectivity problems; the simple terminal ping is a lot easier to use.
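If you would rather not eyeball the march of lines, the same check can be done mechanically. The sample lines below are invented so the sketch runs without a live connection; with one, you would feed in something like ping -c 10 google.com instead.

```shell
# Invented sample of ping output lines, for illustration only:
sample='64 bytes from 8.8.8.8: icmp_seq=1 ttl=117 time=23.4 ms
64 bytes from 8.8.8.8: icmp_seq=2 ttl=117 time=25.1 ms
64 bytes from 8.8.8.8: icmp_seq=3 ttl=117 time=310.7 ms'
# Pull out each time= value and report min, average, and max --
# exactly what the eyeball test looks for: short and similar times.
stats=$(printf '%s\n' "$sample" | awk -F'time=' '
  { split($2, a, " "); t = a[1] + 0
    if (n == 0 || t < min) min = t
    if (t > max) max = t
    sum += t; n++ }
  END { printf "min=%.1f avg=%.1f max=%.1f", min, sum/n, max }')
echo "$stats"
```

A max wildly above the min (as in this sample) is the "times not similar" symptom described above.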
Another useful resource (mentioned on the site linked in the note above) is speedtest.net and its newer sister site pingtest.net; Speedtest not only does speed tests, with a source of your choice from an extensive, geographically wide selection, but stores the results for you for ongoing comparisons. All available "speed tests" have their shortcomings as far as exactly measuring true speed goes, but sticking to this one allows comparing results now with results then on a consistent basis, which is nice.

A "Domain-Name Server" (DNS) is an internet resource that turns a familiar text-style address (such as www.google.com) into its real IP (Internet Protocol) address, which is a clump of four numbers separated by periods (such as 74.125.127.99). All internet applications resort to nameservers whenever they do anything on the net. The nameservers you use are normally something set up for you (or at least given to you) by your ISP. There is, however, a neat little application named NameBench that conducts speed-of-response benchmarking tests on publicly available nameservers, using (though there are other options) your actual internet-use history (as saved in your browser), so as to roughly replicate the kinds of calls you actually make. It is available through the Ubuntu repositories as just namebench, and is, we think, worth getting and trying. Switching nameservers may materially speed up your typical web-page-load time (most pages require quite a few DNS calls, for such things as images, CSS, and so on). (How you set or change DNS on your system we do not cover here, but it's rarely difficult--try the info in the link.) One of these days soon now, the world will move to the IPv6 scheme--much longer addresses, written as eight colon-separated groups of hexadecimal digits--as we are, amazingly, running out of four-number (IPv4) combinations.
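The kind of measurement NameBench automates--how long a single name lookup takes--can be sketched by hand. We resolve "localhost" here only so the example runs without a network (it is answered from /etc/hosts); a real probe would use the remote hostnames you actually visit, resolved through your configured nameserver.

```shell
# Time one hostname lookup, in milliseconds. "localhost" is used purely
# so this sketch needs no network; substitute real hostnames to probe
# your actual nameserver.
t0=$(date +%s%N)
addr=$(getent hosts localhost | awk '{print $1; exit}')
t1=$(date +%s%N)
echo "localhost -> $addr in $(( (t1 - t0) / 1000000 )) ms"
```

Run in a loop over a list of hostnames, this gives a crude per-nameserver latency profile of the sort NameBench reports.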
You can send unobtrusive little popup messages to the desktop very easily (if you installed the libnotify-bin package we recommended); that is a handy feature to put in, for example, at the end of a shell script (such as a backup) to let you know the job is finished. The messages stay up for a few seconds (you can control how long), then disappear with no need for user interaction. To see the syntax, just open a terminal and execute notify-send --help, or try notify-send 'Heads-Up:' 'This message came from notify-send'

Sometimes a .deb package you want to install will cough and refuse to go in, because it is a 32-bit package and your system is (we assume, based on our recommendations) a 64-bit one. The clue is a warning message from the Software Center that reads Error: Wrong architecture 'i386'. All you have to do then is "force" the "architecture":
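The command or procedure originally given at this point appears to have been lost from the page. One standard way to force dpkg past the architecture check (the filename package.deb below is a placeholder--substitute your actual .deb) is:

```
sudo dpkg -i --force-architecture package.deb
```

Be aware that forcing in a 32-bit package this way only works if the package's actual dependencies can still be satisfied on your 64-bit system.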
That should complete the installation.

Keyboard shortcuts within applications can be changed by pressing keys while the mouse is hovering over a menu item in the app, provided that editable shortcuts are enabled in your desktop session. To enable such shortcut editing in GNOME, you have to set a "secret" option, either by executing in a terminal--

gconftool-2 --type bool --set /desktop/gnome/interface/can_change_accels true

--or else by ticking the can_change_accels box under desktop --> gnome --> interface in the Configuration Editor, gconf-editor (under Applications --> System Tools).

We formerly recommended the Alexandria library-cataloguing package, but we find that the web site LibraryThing does the same thing somewhat better. There is, past 200 books, a modest fee, but it's well worth it.

If you have an audio-CD collection that you want to catalogue--not "rip" or "tag", just catalogue (as with, for example, classical music)--the tool you want is the set of little utilities packaged in the repositories as cdtools. Once they are installed, you can make a simple little script as follows:

#!/bin/bash
cdclose
echo "Querying freedb . . ."
cdown >> filespec
cdstop
echo '=============================' >> filespec
cdeject
echo "Insert another CD or close the drive."

There, filespec is some file in which you want to keep the list of your CDs. You can, if you have the desire and the skills, do more than just echo the output of cdown to that file. If, for example, you know some PHP, you can get the raw record from the online database (by calling cdown with the -r parameter) and do some processing; you can then store the data in individual files, one per CD, identified by their unique database "discid". Indeed, you can include the commands listed above as PHP commands, using exec or some similar call.
There is a man page for the cdtools package, and also man pages for some of the individual tools: all are worth careful reading. It is a very handy little package.
---=== end of page ===---