Saturday, October 20, 2007

Worried About How to Play MP3 and DivX Files?

Want to play multimedia? Want to get Flash and Java? All that stuff used to be a pain in the ass to get going in Linux.

Don't worry. It's only three, easy steps now.

1. Enable the Universe and Multiverse repositories

The software we need isn't officially supported by Ubuntu, but it's on their servers. We just need to tell Ubuntu 7.10 to start looking there.

Go to System -> Administration -> Software Sources and tick the Universe and Multiverse entries, as seen in the screenshot.
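If you'd rather skip the GUI, the same change can be made from a terminal. This is only a rough sketch -- the mirror URL and the "gutsy" release name are my assumptions, so match them to whatever is already in your /etc/apt/sources.list:

# Add universe and multiverse for Gutsy (adjust the mirror and release name to match your sources.list)
echo "deb http://archive.ubuntu.com/ubuntu/ gutsy universe multiverse" | sudo tee -a /etc/apt/sources.list

# Refresh the package lists so apt knows about the new repositories
sudo apt-get update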

2. Install all the codecs and plugins

It's as simple as clicking this link:

Ubuntu Restricted Extras

This will install the Ubuntu Restricted Extras package from the Ubuntu repository (no, you're not downloading anything from here). Type in your password, and you're done.
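If the link doesn't work in your browser (some browsers don't handle apt links), the same thing can be done from a terminal. A minimal sketch, assuming the package keeps its usual name in Gutsy:

# Install the restricted codecs and plugins bundle from the repositories
sudo apt-get install ubuntu-restricted-extras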

3. Install DVD playback

The only other thing to mess with is DVD playback. Simply hit Alt-F2 to bring up the Run Application dialog and paste this line in:

gksudo /usr/share/doc/libdvdread3/install-css.sh


Click Run.
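If the run dialog complains that the script doesn't exist, libdvdread3 probably isn't installed yet. A terminal sketch that covers both steps, using the same package name and script path as the command above:

# Make sure libdvdread3 (which ships install-css.sh) is present
sudo apt-get install libdvdread3

# Download and set up libdvdcss so encrypted DVDs will play
sudo /usr/share/doc/libdvdread3/install-css.sh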

You're done, and Ubuntu 7.10 should do everything with multimedia that you ask it to.

Monday, October 15, 2007

Reflections on the Seven Reasons Ubuntu Became Popular

I saw blog posts about the seven reasons why Ubuntu became popular, and others questioning why you would need Windows when Kubuntu is available. I have my own opinions on why Ubuntu became so popular (much more so than Kubuntu) and why it's worth trying.

  1. Ubuntu had a vision: Mark Shuttleworth took a good look at desktop Linux and tried to identify where it was lacking. He decided that eight million applications on twenty-five CDs made Linux difficult for new users. His policy was to distribute a single CD with only the best of each kind of application installed. These applications wouldn't be called by name, but by purpose, reducing confusion for new users.
  2. Following Standards: Ubuntu puts a lot of effort into the organization of the software, and they follow the recommendations of Freedesktop.org, ensuring that whatever they do can be taken and used by anyone else following those standards. Gutsy sees the addition of XDG base directories. Hopefully we will see applications begin to look in these directories by default.
  3. Commitment to progress: Canonical (Ubuntu's umbrella corporation) made early commitments in areas that didn't have strong applications. MPlayer and Xine were decent movie players at the time Ubuntu came out, but they both skirt the edge of copyright infringement, making them less than optimal. Canonical committed to the GStreamer backend early on, even though it had fewer features at the time, because they saw that it would provide a consistent and well-structured way to do audio and video in Linux. The developers worked together with the Gnome project to make this happen. A lot of new Linux distributions repackage stuff and stop at that. Certainly the larger ones like Red Hat and Suse have quite a few developers on staff, but few others add capabilities to applications. Geexbox is a great project and spends an enormous amount of time on the glue holding stuff together, but the developers don't really work on the MPlayer or Xine movie players. Right now, Ubuntu is working hard to bring the Telepathy framework up to speed, which will do for IM, video chat, and telephony what GStreamer did for audio and video.
  4. Easy application installation: Some developers complain about the Add/Remove application (available under the Applications menu) duplicating the function of the Synaptic Package Manager (available under System -> Administration). Synaptic isn't very user-friendly. Sure, it's not difficult to use, but compare it to the simple Add/Remove interface and you'll see what I'm talking about. Add/Remove made finding and installing new software easy-peasy.
  5. Community: Ubuntu made an early and large commitment to a user community. Fedora was always second to Red Hat -- everyone knew it was the place to dump and test all the unstable stuff. Suse (at that time) didn't distribute for free. Ubuntu's "community version" is the same as the "enterprise version," and always will be. There's no per-seat limit -- no CPU limitation. It was free and community-oriented. The forums encouraged regulars to sign an agreement on how to approach new users with problems so that everyone felt included. Canonical "got" Open Source from the moment it left the gate. You do not know how rare this is. Think of Linspire, Xandros, and Xara XTreme as bad examples.
Ultimately, Canonical made the right moves at the right time, making a real commitment to good software and a user community. Shuttleworth had great success before Ubuntu, so I'm sure his experience and drive were contributing factors.

Friday, October 12, 2007

So You Want to Know How to Use Anti-virus Software on Ubuntu?

You've got an Ubuntu system, and your years of working with Windows make you concerned about viruses -- that's fine. While Ubuntu (and Linux in general) is a very secure system, and Ubuntu comes with no "open ports" (that means avenues by which worms can get into your system without your assistance), there is always a certain danger from malicious software. The following is an overview of the entire list of Linux viruses and worms known at this time, courtesy of Wikipedia:
Worms
  • Net-worm.linux.adm: This is a worm from 2001 which exploited a buffer overrun (one of the most common methods for viruses). It scans the network for computers with open ports, tries the attack, infects web pages hosted on the system, and propagates further. This worm is not dangerous to you because the buffer overruns have been patched for years and you do not have any open ports.
  • Adore: An infected computer scans the network for DNS, FTP, and printer servers, infecting them using various methods. A backdoor is installed and the worm propagates itself. This worm is not dangerous to you because the methods of attack are also from 2001 and have long been patched. Even if they weren't patched, you don't have these services running on your Ubuntu system.

  • The Cheese Worm uses a backdoor which was installed by another worm, removing the backdoor and propagating. It is, in fact, an attempt to clean an already infected system. This worm is not dangerous because the worms it needed in order to propagate are no longer dangerous. Whether it was ever dangerous in the first place is debatable.
  • Devnull is a worm from 2002 which used an old OpenSSL vulnerability to infect a system, becoming part of an IRC-controlled botnet. The worm could only propagate if a compiler was present on the system. The vulnerability it used has long been patched, and the vulnerable services aren't installed on your Ubuntu system by default.
  • The Kork Worm uses the Red Hat Linux 7.0 print server and needs to download part of itself from a website. That website no longer exists. Red Hat 7.0 is not Ubuntu Linux. You are safe.
  • The Lapper Worm has no information available about it at all, anywhere, so I can't tell you much, but it was added to the list in 2005, and any vulnerabilities it exploited have almost certainly been patched by now. I can't say for certain whether this worm could affect you, but most vulnerabilities are patched within days, not weeks, so two years makes it very unlikely you could be affected by this.
  • The L10n Worm (pronounced "Lion") was active in 2001 and exploited a print server. The vulnerability has been patched and the server is not installed on Ubuntu. This is no danger to you.
  • The Mighty Worm appeared in 2002 and used a vulnerability in the secure session module of the old Apache web server, installing a backdoor and joining an IRC botnet. This vulnerability has been patched, Apache is not installed on your system, and the entire architecture of the web server has changed. You can never get infected.
  • The Slapper Worm used the same vulnerability as the Mighty Worm and operated similarly. You can't get this one, either.
Viruses
  • The Alaeda Virus is relatively recent (May) and infects other binary (program) files in the same directory. If you run as a normal user doing non-programming work, you should not have any other binaries in your home folder, so Alaeda won't have anything to infect. This is a good reason why you shouldn't download and install random files off the Internet. If you don't know why you're typing in your password, don't do it. Realistically, though, ELF files (the Linux equivalent of a Windows .exe) are pretty picky about what system they run on, so the chance of getting infected is slight.
  • The Binom Virus is from 2004 and affected ELF files in a similar manner to Alaeda. The same conditions apply here. Your chance of getting infected is zilch if you don't give a password, and not much even if you do. Be safe, though, and don't run random attachments.
  • The Bliss Virus was probably a 1997 proof-of-concept by someone trying to prove that Linux could be infected. Because of the Linux user privilege system and the thousands of versions of Linux, it didn't do well at all. This one is in the same boat as the two others. Almost nothing about the Linux kernel is the same as it was in 1997. Don't worry.
  • The Brundle-Fly Virus was a research virus for an operating systems course and was never in the wild. It even has a web page and an uninstaller. If you want to get infected by a virus, this one is good. You'll need to compile it for your system, though, so be prepared to follow a lot of complicated instructions.
  • The Diesel Virus is called "relatively harmless" by viruslist.com. It's an ELF virus, just like the others, discovered in 2002. No need to be concerned.
  • The Kagob Virus comes in two flavors and even contains a copyright notice (2001). There are no symptoms of infection. Interestingly, when run, the virus disinfects the infected file to a temporary directory before running, then deletes the file after it is executed. Same ELF problems as before. You won't get this one, either.
  • The MetaPHOR Virus is another project with its own web page. The exact function and evolution of the virus are laid out there. From 2002, it shouldn't represent any risk, even if you can find one in the wild. If you really want to get infected, download the source and compile it yourself.
  • OSF.8759 is the first really dangerous virus on the list. It not only infects all files in the directory (and system files if run as root), but also installs a backdoor into your system. The backdoor doesn't suffer from the problems of normal ELF viruses because the virus itself loads the backdoor. This means that the virus still needs to work under ELF, though, limiting the chance that it will work on your system. Since the virus is from 2002, there is virtually no chance that it will run on your system. If a new version becomes available, you might need to worry.
  • The RST Virus is also from 2002 and also installs a backdoor. It, however, operates under normal ELF rules, making it virtually harmless to today's systems.
  • The Staog Virus was the first Linux virus, created in 1996. It used vulnerabilities which have long been patched. It cannot harm you.
  • The VIT Virus is another ELF virus, this time from 2000. Since Ubuntu didn't exist seven years ago, you won't be running a system that old and won't be infected.
  • The Winter Virus is also from 2000 and is the smallest known Linux virus. It suffers from the same problems as all ELF viruses.
  • The Lindose Virus is another proof-of-concept virus, from 2001, showing how a virus can be constructed to infect both Windows and Linux computers. It has never been seen in the wild.
  • The ZipWorm Virus spreads by infecting .zip files. When run, the virus infects all other .zip files in the directory. It has no other ill effects. From 2001, it is unlikely you'll ever run across it.
That's the entire list of Linux viruses and worms. Fewer than thirty. Compare that to the estimated 140,000 viruses for Windows, and you'll understand why people say you don't need a virus scanner on Linux.

The Reality
If you are going to trade files in a Windows world, you'll need to scan those files for viruses. You won't get infected, but you may help infect someone else. There are two ways to do this:
  1. Run all the files through a server which checks for you. GMail, Yahoo mail, and Hotmail all have wonderful checking software.
  2. Check the files for viruses yourself. You'll need to go to System -> Administration -> Synaptic Package Manager and search for avscan. Install the package. It won't appear in the menu. Run it by pressing Alt-F2, typing avscan, and pressing Run.


You can now scan files (or your entire system) for viruses and worms.
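If you'd rather work from a terminal, here is a rough sketch. The avscan package is the one mentioned above; clamav and its clamscan command are my own suggestion for a command-line scan:

# avscan is the graphical front-end; clamav provides the clamscan command-line tool
sudo apt-get install avscan clamav

# Recursively scan your home directory and report only infected files
clamscan -r --infected ~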

Wednesday, October 10, 2007

Install Deluge and Enjoy Bittorrent Freedom

If you've recently installed Ubuntu and tried to download something via Bittorrent, you'll have noticed that a bittorrent client is installed by default. Just like the rest of Ubuntu's applications, you'll find everything you need for day-to-day use already installed. The basic bittorrent client has few options, though: downloading more than one torrent at a time can be difficult, and there isn't any way to limit the upload or download speed globally, only per torrent.

Many users install Azureus to get a full-featured bittorrent client, but it is slow and uses a large amount of memory. uTorrent isn't available on Linux, but a nice little Gnome-friendly app called Deluge is!


It can easily handle ten or more torrents without affecting your CPU or memory in a significant way. The Deluge site says
When Deluge was first released in September 2006, it was very limited and lacking in features, but over the last year, Deluge has been one of, if not the most rapidly developed bittorrent client on the web. Now, Deluge is among the most feature-rich clients in development (second only to Azureus, but without the bloat, and ahead of µTorrent according to http://en.wikipedia.org/wiki/BitTorrent_client) , and it does this without the need of tools such as Java or Wine. Deluge was created with the intention of being lightweight and unobtrusive. It is our belief that downloading shouldn’t be the primary task on your computer, and therefore shouldn’t monopolize system resources.
Plugins
Deluge offers a large number of plugins, too, extending its functionality.

Linkage is another client that works well within Ubuntu, but it's still quite new and lacks the features (especially the plugins) that Deluge has.
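Installing Deluge is one command away. This sketch assumes the Gutsy package is named deluge-torrent (check Synaptic if apt can't find it) and that the universe repository is enabled:

# Deluge lives in the universe repository
sudo apt-get install deluge-torrent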

Good luck torrenting!!!

Webrunner -- Is all the fuss worth the effort?

What is Webrunner?
Webrunner is a fork of Mozilla (the basis for Firefox) which is designed to make web apps like GMail and Google Docs appear more like desktop (local) apps. It claims to use native widgets for things like buttons, making the page more closely follow the theme of the desktop. How well does it work? Let's take a look.

Webrunner's Look
Take a look at the following screenshots:
GMail
A couple of the buttons at the top of the screen look nice, but the rest of the page looks the same.
Google Docs

Again, a couple of buttons at the top (and no URI bar or menus).
Facebook

The Facebook pages look identical to the non-Webrunner version.
Yahoo Mail

Yahoo won't let us try the Beta (because the browser is "unsupported"... meh!), but even the normal buttons don't appear different.

What does it mean?

Let's compare the appearance of several other applications rendering the web. Konqueror gets the same look without requiring a separate application.

While I think that the idea of Webrunner is great, it's an alpha product that will supply nothing users can't get from Konqueror or Safari right now. Even Epiphany (the default Gnome browser) users get most of it, like the scroll bar in the following screenshot.

Monday, October 8, 2007

New Directories in 7.10 (Gutsy) Causing Confusion

This post shows that a new feature of Ubuntu 7.10 is causing confusion for a few users. Really, the feature is Freedesktop.org's xdg-user-dirs scheme for standard user directories, a companion to the XDG Base Directory Specification.

This creates specific directories for user files, allowing XDG-compliant applications to look in these locations automatically. It also allows for localization (that means translation into the language of your choice) of these directories without losing the benefits of a standard location -- the names are set up on the fly during login, and the mapping is kept in a standard configuration file.
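For the curious, the mapping lives in a small per-user file and can be queried with the xdg-user-dir tool. A sketch of what this looks like -- the exact entries are just an example and may differ from what your login generates:

# ~/.config/user-dirs.dirs -- maintained at login by xdg-user-dirs-update
XDG_MUSIC_DIR="$HOME/Music"
XDG_PICTURES_DIR="$HOME/Pictures"

# Applications (or you, from a terminal) can look up the localized path:
xdg-user-dir MUSIC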

The upshot is that new users can move all their music to the Music directory and expect that newer audio players will know where to look for the music library. There won't be as much of a need for first-time wizards and the like. Rhythmbox will be ready with your music collection the first time you start it. Talk about ease of use!!!

My suggestion to older Ubuntu (and other Linux) users? "Move (or link) your files to the standard locations. Freedesktop.org has improved the desktop Linux experience tenfold. Accept the standard and work within it."

Wednesday, October 3, 2007

Video Overview of the Ubuntu Desktop

Crappy, again, but I'm getting better. This one is about 22 minutes and covers the top and bottom panels.

The Google Video

Used Car Salespeople

There's some great conversation going on across the tech boards and tech news sites I visit. It revolves around the use of marketing in Open Source offerings, and whether the little white (or not-so-white) lies and FUD are just a part of doing business.

Since long before I ever got into computers, the business has been the same: salesmen resemble the used-car variety, saying whatever they need to say to get you to buy. First it was expensive mainframe hardware, which generally came with free software, by the way. Next, once business computing took off, they wanted to sell you the whole suite of hardware and software. Once the PC became a commodity item, they sold you the software, locked down in any way possible to keep you from moving to anything else.

They were able to follow the car sales method because computers and the software were big ticket items, and once you were with a particular vendor, it was extremely hard to move. After the sale, you found out that many of the touted features didn't actually work as advertised or at all, but you'd sunk so much money into the system already that you were committed and couldn't just scrap it all and start over. Even if you did start over, the other vendors were all likely to lie to you, too. Buyer beware, eh?

This kind of business stopped some time back for mainframe vendors. Clusters of PCs can now outperform the heavy metal on many tasks. Regular PCs and servers handle 75% of the other tasks mainframes used to be called on to do. No, the market isn't dead. The people who need mainframes know who they are and have investigated the other options before making the hard decision to lay down the cash for the big machines. Mainframes are still sold and are still somewhat profitable, but the vendors can't lie and cheat their way in anymore. The cost of losing credibility is too high.

The PC market used to be fractured. In the early days, different architectures, busses, and adapters meant that you got locked in. Thankfully, we've settled on industry standards like ISA, PCI, AGP, USB, and PCIe for some time now. Imagine if your computer came with a proprietary monitor jack which only worked with the vendor's overpriced monitors. That's the way it used to be, folks. No more, though, thankfully. Don't like your monitor? Just walk into any tech store and buy a new one from any manufacturer you want. Graphics card too slow? Buy another from any of several vendors, online or off. There's no room to lie in marketing literature, or the customer will simply return the item and buy from someone else, telling friends how awful the experience was.

Sadly, the practice isn't completely gone: old habits die hard. NVidia and ATI were both recently in the news over advertising features which their cards didn't actually have. They got away with this for a short time because the features they were touting (like Vista and DX10 compatibility) weren't available to consumers for months after purchase. Once the consumers discovered the ploy, they'd already used up a good portion of the life-cycle of the graphics card. ATI's reputation suffered for years after it was found that it used its driver to cheat on benchmarks. Graphics card makers seem to be about the only ones left who follow this despicable practice.

Oh, and software houses. We can't forget them, can we? Features planned for future versions of one vendor's software are announced to take the limelight away from other vendors' newly released products. Microsoft is especially good at this method. Whether those features ever appear in the final product (or even whether the product ships) is not really important. What is important is limiting your competitors' ability to market effectively. Fear, uncertainty, and doubt (FUD) were first used by IBM during the mainframe days, but the method was co-opted by Microsoft (and some say perfected by them). FUD is now an industry-standard way to deal with competitors' innovations.

This is why we need real software standards. Quickbooks is a great accounting application, but few people can just pick up and move to another one without losing a lot of old, important data. MS Office is undoubtedly the most advanced office suite available, but what does a business do if it realizes that it doesn't need the advanced features anymore? It can't keep using the same software forever: once MS decides the software is at its end-of-life, the business gets no more support or security updates. The company has to move to new software, and data in the old format keeps the business from moving to any vendor other than MS.

We need PDF, HTML, XML, SQL, and other real standards for program data. ODF (Open Document Format) is an ISO standard for office documents. We should be using it. SVG is a standard for vector graphics. Flash is not. We should be using the former, not the latter. We need similar standards for other common formats, like accounting software. Until our software gets onto standards the same way hardware did, software houses will continue to give us the same old song and dance, shining us on while they pocket our money in exchange for empty promises.

Right now, Microsoft is trying to push a software standard through ISO, resorting to buying votes whenever it needs to. This standard is effectively useless for any software house except Microsoft's own. It will give them the illusion of following a standard (getting around government requirements) while still locking the users into perpetual MS Office use. Be vocal. Say no to OOXML.
