> Why does the file size matter? Are the devices you use so short on storage that an extra 100 mb is an issue?

No, but just about any aspect of computing benefits from smaller file sizes, or smaller data sizes in general: CPU caches, RAM caching of files, file transfers, syncing things over the network, backing things up.

My preferred way of installing and managing software on Ubuntu is compiling the source into /opt and then installing it into the system with the Debian/Ubuntu update-alternatives system. It allows you to install multiple different versions side by side and toggle which of them is called by the canonical system command in /bin, /usr/bin, or wherever. You install them in /opt or somewhere else where collisions won't occur, then use the update-alternatives command to link them into /bin, /usr/bin, or other system directories.

It takes a little more work up front, but it is worth it: both upgrades and rollbacks become completely painless, and you get to decide which version you use. The up-front work is in scripting the various binaries and manpage files that need to be linked together, atomically, into the system directories; all subcommands must be slaved to the main command, and update-alternatives enables this. Here's an old (out-of-date) example script for installing Haskell Platform after building it in /opt/ that demonstrates this.
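The pattern can be sketched as follows. This is not the original Haskell Platform script, just a minimal illustration of registering one binary as the master alternative with related tools and manpages slaved to it; the paths, the GHC version, and the priority value are illustrative assumptions:

```shell
#!/bin/sh
# Sketch: register a compiler built under /opt as the system "ghc",
# with companion tools and the manpage slaved to the same alternative
# so they all switch together, atomically.
# RUN=echo (the default here) previews the privileged commands;
# run with RUN= under sudo to actually apply them.
set -e

PREFIX=/opt/ghc-9.4.8       # illustrative install prefix
RUN="${RUN:-echo}"

$RUN update-alternatives --install /usr/bin/ghc ghc "$PREFIX/bin/ghc" 50 \
    --slave /usr/bin/ghci ghci "$PREFIX/bin/ghci" \
    --slave /usr/bin/runghc runghc "$PREFIX/bin/runghc" \
    --slave /usr/share/man/man1/ghc.1.gz ghc.1.gz "$PREFIX/share/man/man1/ghc.1.gz"

# Switch between registered versions interactively:
#   sudo update-alternatives --config ghc
# Or roll back non-interactively:
#   sudo update-alternatives --set ghc /opt/ghc-9.2.8/bin/ghc
```

Because every slave link is tied to the master, `--config` or `--set` flips the whole toolchain at once, which is what makes rollbacks painless.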