Ubuntu jingle latest

As some people may know I’ve been trying to make an Ubuntu jingle for LUG Radio. So far this has been nothing short of painful. If you read this post you’ll see I bought 2 new soundcards to try to finish the job.

In the first instance this was to do with my crappy onboard soundcard, which seemed to enjoy chewing up anything recorded via the mic and line inputs. My laptop was away being fixed at the time, so I tried Dynebolic Linux on my dad's PC (the soundcard wasn't recognised even though it's a Creative Soundblaster PCI 16, supported by the es1371 kernel module), on my university project machine (similar problem to my main desktop machine – crap quality and crackly), on my laptop when it returned (Dynebolic wouldn't boot with ACPI enabled and I couldn't seem to use the sound device with it disabled; I haven't had a chance to install Ubuntu on it yet), and finally on my old Dell Optiplex desktop machine with an Intel i810 chipset and everything onboard, which has a lowly 128MB of RAM and struggled to keep up (it uses non-standard RAM so I couldn't drop in a spare stick). All of the other machines were Via chipsets using onboard sound (except my dad's PCI 16 soundcard, of course).
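For anyone trying the same thing, the quick test of the record path on each machine boils down to something like this (assuming ALSA is in use and the alsa-utils tools are on the CD):

cat /proc/asound/cards              # does ALSA see the soundcard at all?
arecord -d 5 -f cd /tmp/test.wav    # record 5 seconds from the default capture device
aplay /tmp/test.wav                 # play it back and listen for crackle and chewing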

Nevertheless, the best sound quality came from the i810 machine, so tonight I persevered, mounted the hard disk (Dynebolic is a live CD), copied the files across from my USB drive (running the project from it causes stutter) and ran the project from there.

It went great at first but as I layered the tracks it started to slow down as the memory got used up, so much so that by the time I recorded about 4 vocal tracks (I’m simulating a tribe), the machine became barely usable and the kernel killed the Audacity process so I lost everything I had done in the session. I hadn’t saved it as I was technically testing how well it would work.

As Dynebolic is a live CD and runs in memory, I figured that maybe Audacity was saving some kind of user data in /home, which is also in memory, so I mounted the home partition of the hard disk as /home to see if that saved me a few MBs.

For some reason all this did was make the recorded mic stream slow, bitty and deeeeeeeep. Not to be denied, I unmounted the partition and remounted it under /mnt/home as it was before. I also tried to turn on the hard disk swap partition as swap space, but this failed as it turns out Dynebolic already does this.
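For the record, the sort of commands involved look something like this (the partition names are guesses; the disk layout will differ):

mount /dev/hda3 /mnt/home    # keep the Audacity project on the hard disk rather than in RAM
swapon /dev/hda2             # enable the disk's swap partition (Dynebolic had already done this)
free -m                      # check how much memory and swap are actually available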

Anyway I decided to try again, this time by saving the project every time I recorded a new audio track. I got to about 6 vocal tracks before it started getting flaky so I exported what I had as a wav and called it a night.

I'm not entirely happy with it, but at least I have a proof of concept. I'll wait until I have my new Soundblaster and make a proper attempt under Ubuntu. For the moment I won't post it as I'm not sure if it sounds crap or not. You know when you hear your own voice (and accent in my case) on tape? Terrible…

All Soundblasters are not equal

Recent readers will know that I bought a new Creative Labs Soundblaster Live soundcard as they seem to be very well supported under Linux, in a bid to finish this goddam Ubuntu jingle for LUG Radio. Well it arrived today and guess what? It doesn’t work properly under Linux at the moment.

It seems that there are two Soundblaster Lives and they use different chipsets. The older 5.1 is known to work perfectly under Linux using the emu10k1 kernel module and the newer 24 bit 7.1 card doesn’t (the numbers refer only to the number of surround sound speakers, not a versioning process like software release numbers).

Before I bought the card I checked the ALSA website which says here that the Soundblaster Live is supported by the emu10k1 module. I also searched Google for Linux support and read this. Sounds fine I thought at the time.

Until it didn't work. After searching the Ubuntu forums for the SB Live, I read this thread which points out that the new 24 bit SB Live 7.1 uses a different chipset to the SB Live 5.1 and in fact needs the audigyls module, which isn't entirely working and is also only available in v1.0.6 or above of ALSA, which isn't available in Ubuntu yet. Why is it you only find this stuff out after you buy it?
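For the record, checking which ALSA version a running system actually has (and whether it has claimed the card) is quick, assuming the /proc/asound interface is there:

cat /proc/asound/version    # the ALSA driver version built into the kernel
cat /proc/asound/cards      # which soundcards ALSA has picked up, and with which driver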

It also seems that the 24 bit 7.1 card is a piece of shit anyway. Rather than do onboard hardware mixing, it palms it off to the system CPU to do all the work, and it is this that has made the driver slower to develop as they had to work out how to do it. Now I know that modern computers like mine are powerful enough to handle this, but would you be happy with a software modem if you were expecting a hardware one? Nowhere in the product spec does it say this.

So I had 3 choices:

  • Upgrade to Ubuntu Hoary
  • Compile the latest version of ALSA myself
  • Or forget it and buy an SB Live 5.1

I don't fancy upgrading to Hoary as I have yet to hear whether the card works with the version of ALSA in Hoary. I prefer to stick with a stable version of Ubuntu now I've moved over full-time.

I don't fancy moving to a compiled version of ALSA as this means I will have to work out how to put the packaged versions of ALSA on hold in apt, and I risk making a mess of the sound system by compiling it myself.
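For the record, holding packages isn't actually that bad; a rough sketch with dpkg, assuming the packages in question are alsa-base and alsa-utils:

echo "alsa-base hold" | dpkg --set-selections     # run as root: stop apt upgrading alsa-base
echo "alsa-utils hold" | dpkg --set-selections
dpkg --get-selections | grep hold                 # confirm the holds took effect

The scary part is less the hold itself and more keeping a hand-compiled ALSA in step with future kernel updates.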

So I wimped out and found a 5.1 from Scan for £10 or so. That's an expense I could really do without, to be honest. It cost me £26 or so for the first card at a time when I've just found out I am £3 overdrawn and have no money coming in until early April. I had to transfer money off my credit card to cover my bills for the next 6 weeks.

Well, I've made a big noise about this jingle now and I seem to be getting some decent traffic because of it, especially when I was linked by Jeff Waugh on Planet Gnome, Planet Debian and Planet Ubuntu. Also, I told the LUG Radio guys about it nearly 6 weeks ago and they have been waiting for it ever since. I've had quite a few people post comments about it too, so I now feel some kind of responsibility to produce something. God help me if it's shit…

Don't know what I'm going to do with the SB Live 7.1 just yet. I might try to sell it on eBay or something to see if I can make some money back, or I might keep it and see a) if it works under Hoary and b) if it is actually a better quality card than the 5.1 despite the hardware mixing cop-out.

One day I will learn that knowledge of Linux hardware support is not innate, nor is it as simple as it looks from a kernel perspective. The ALSA people are doing their job (although it would be nice of them to state that the 7.1 is actually a variant of the Audigy LS and not a variant of the SB Live as the name suggests, by listing the 5.1 and 7.1 separately), but it seems Creative have named the 7.1 based on where it fits into their range and not on what chipset it uses.

Bastards.

UPDATE:

After reading a lot of threads about this problem, I filed a bug against the ALSA website asking them to point out that the 24 bit SB Live 7.1 uses the audigyls driver and to specify that the 5.1 and the 24 bit 7.1 are different cards, as the site just said that the SB Live uses the emu10k1 driver. Hopefully this will prevent other drowning souls in the various support forums around the world from buying the wrong card. They have since updated the site to reflect this. Good of them to research it and actually do it.

Of course Hoary is now out and the 7.1 should be supported, but I have yet to open my box and swap the 7.1 in to check…

Pimping Nvu

There has been talk about Nvu on the Wolves Lug mailing list recently (I still keep forgetting it’s pronounced ‘N-View’, not Un-Voo). Nvu is a redevelopment of Mozilla Composer which got orphaned when the Mozilla Suite was split up into Firefox and Thunderbird.

No doubt, it is the great white hope of WYSIWYG web editors on Linux and other open source platforms. Before I stopped using Windows, one of my worries was that I didn't know what I would use instead of Macromedia Dreamweaver. I'm certainly no web developer; take a look at my website. Dismal design, I'm sure you'll agree. But the fact is that while I aim to move to a proper CMS some time in the future, I don't care enough about web design to put the effort into doing it all in HTML by hand in Bluefish or something. My blog is a brain-dumping ground. My website is a link, info and silly email dumping portal for myself and some of my friends. I'm not interested in writing it properly because I don't have the time. I need a WYSIWYG web editor and a copy and paste function ;).

So I heard about Nvu on the mailing list and took a look. As I said in some other posts, I started my website in Frontpage Express 5 years ago before I knew anything about web design, then I moved to Frontpage and then Dreamweaver.

Aq, who is seriously into his web stuff (and that's still an understatement), recommended that Peter Cannon write a review of Nvu as a Frontpage user, the point being that a review from Aq himself, a committed and experienced web developer who uses a text editor, would not be as useful as one from a user of Nvu's WYSIWYG competitors like Frontpage and Dreamweaver. In that sense, that includes me.

I've edited a few existing pages in Nvu and also created a simple 'this site has moved' type page. Admittedly, I have not created any substantial new pages. But my immediate impression is that it's pretty cool, with one or two things missing.

The main thing is site management. In Frontpage or Dreamweaver you define your local web directory and your remote server details. You edit a local copy of a file and then publish the file to your remote server.

In Nvu you define your remote server and edit existing files by pulling them down from the server and editing them locally before sending them back up. This is OK, but limiting, I think. My server uses ftp to transfer the files and has a maximum simultaneous connection limit to protect against ftp (globbing?) attacks. As my top level index.html page has a number of images in it, opening it hits the maximum connections limit and means that I have to edit the file with half of the images missing. This isn't good for judging the aesthetic appearance of your pages and may leave your page looking out of shape while you edit it.

Working with the files directly from your remote server also means that it's easy to upload files with mistakes in them without realising.

Another nice site management feature would be some kind of site information cache, like an index of all files and links in the site. If you change the name of a file in the site, Nvu could ask if you want to update all links within the site that point to that file. It could also point out any files which aren't linked from any file in the site, and any external links that are invalid (i.e. produce HTTP errors – 404 etc.) or can't be resolved by DNS.
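Until something like that exists, external links can at least be sanity-checked from the command line. A crude sketch, assuming the links have been collected into a links.txt file with one URL per line:

# wget's spider mode requests each URL without downloading the content
while read url; do
    wget -q --spider "$url" || echo "BROKEN: $url"
done < links.txt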

Further to those, or while waiting for the above, I would like it if Nvu could remember where the file I am editing came from on the server. I have more than once downloaded an index.html file from a subdirectory of my webserver, edited and uploaded it, then edited the index.html in the top level directory of the site and uploaded that too. The problem is that Nvu doesn't remember where you got the current file from. Instead it asks you which directory you want to put it in, defaulting to the last directory you uploaded to. This is how I overwrote the index.html in my subdirectory with a new top level index.html and left the old top level index.html where it was, unchanged. Fortunately, I had also gone through the (slightly) laborious task of manually making a local copy of every edited file before uploading it.

One more feature that would be useful is the ability to use SSH to upload files to the server. Fewer and fewer people are using ftp these days and more are using SSH's encrypted SFTP method, so Nvu should offer this.

Now those are some quite big things to request. As I stated [link removed] on the Nvu developer forums [link removed], I would be happy to write these features myself if I were a decent coder, had the time and knew what I was on about.

Nvu is in its infancy and I'm being quite hard on it by expecting these features already. Besides my comments above, Nvu is a very capable WYSIWYG web editor. As I said, it is the best in its field on Linux at the moment and will only get better.

If you are looking for a WYSIWYG web editor for Linux, or a free replacement for Frontpage or Dreamweaver on Windows, go download Nvu now.

If you are a coder with a taste for web development, then go help the Nvu developers [link removed] right away! Writing my requested features is of course your first task 😉

Why are mobile phones so complicated to buy?

I need a new mobile phone (that's a cell phone for non-Brits ;)). I've never been too bothered about having the latest, greatest phone as my miserly student finances can't afford it and my needs are modest. I normally keep a phone for 2 to 3 years. However, after about 2 years my current phone is getting a bit flaky: it started storing all my numbers in phone memory instead of SIM memory, and then at Christmas, with no warning, it 'remembered' all of the numbers I've ever deleted, so I now have 2 or 3 numbers for quite a few people. It's also a bit outdated now: monochrome screen, ZX Spectrum style ring tones and so on…

So I would like a new phone. Colour screen, polyphonic ring tones and camera are pretty much par for the course these days so I’m not going to save money by not having those, besides I want them. Java games I’m not bothered about, data storage is nice, an MP3 player and radio maybe but not critical. Picture/video messaging I probably won’t use much, but a camera that can save video clips I will. Some kind of Linux interoperability would be nice too (recommendations gratefully received).

Well, getting a phone isn’t a problem. Most retailers will give you the phone for nothing with a pay monthly contract these days unless you want the hottest, newest phone. What is baking my brain is the call time deals.

For example:

Random cool phone handset, free. 200 free anytime, any network minutes. 200 text messages free for the first 6 months, £6 per month after the first 6 months (may be cancelled after the first 3 months). Line rental £5 per month for the first 6 months, charged at £15 per month with the difference rebated at the end of the first 6 months provided you return your rebate claim form within 30 days of the end of the initial 6 month period. £15 per month for the second 6 months, charged at £30 per month with the difference rebated at the end of the second 6 months provided you return your rebate form within 30 days of the end of the second 6 month period. Line rental is £30 per month thereafter, based on a minimum 18 month contract period.

This is a pretty standard mobile phone contract tariff in the UK. £5 per month is among the lowest deals around at the moment, but with more expensive contracts the terms are normally the same. If you want a new phone without a contract then you are paying a few hundred pounds for a handset.

I just want an affordable phone contract without the hassle. For this reason I am staying put with my perfectly reasonably priced contract, although the restrictions on call time (500 mins per month, 7pm til 7am and all weekend) and cross-network calls (50 mins per month, 7pm til 7am and all weekend), plus 100 free text messages, for £20 per month are a little inconvenient. The cross-network and pre-7pm calls are what cost me more than I can afford as a penniless student. That's why I'm looking at these 'cool new camera phone, £5 per month for 6 months with free anytime, any network calls' deals, as I will be able to afford them in 6 months when I have finished uni and the charges go up. Charge me a £15 or £20 flat fee for a minimum of 12 months with 200 cross-network minutes and 200 text messages and I'm yours. Make me fill in forms, pay extra, get it rebated and have the price go up in stages, then fuck you.

I would go on Pay as You Go, but that would mean I would never be able to afford to call anyone ever.

What the fuck is going on?

Learning more Googling

We all know Google is the best search engine and I use some of the advanced features quite a lot. My favourite trick is the site: search, where you specify a website to search. I noted on the Wolves LUG mailing list about 15 minutes ago that site: searches don't accept subdirectories like http://mailman.lug.org.uk/pipermail/wolves/, which I would like so I could search the Wolves LUG mailing list archives for specific things, something I do a lot.

Aq kindly pointed out that you can do this with inurl: searches which I hadn’t noticed before, so while I can do site:mailman.lug.org.uk wolves and still get a lot of non-Wolves LUG stuff, if I do inurl:mailman.lug.org.uk/pipermail/wolves/ [search term] then it will search the exact url for [search term]. I didn’t know this and as I told him, this has made my life complete 😀 Thanks Aq.

Now I must blush, having told a few people recently that they need to learn to search Google properly instead of asking obvious, easy to answer questions whose answers are available at the end of a simple Google search. However, on that note, as I have previously ranted, if you learn to search Google you will learn a lot more, and learn how to help yourself, than if you just ask someone else for an easy answer.

The Pleasure and Pain of Gentoo

Heh 😉 I’m gonna have to start thinking of another title for my Gentoo posts.

Well, Gentoo is finally installed on my Sun Ultra 10 Sparc64 machine. It went OK really, apart from the fact that it has probably taken me 24 man-hours or so over 3 sessions. The (Sparc64) Gentoo docs are very good and useful for non-Gentoo-specific stuff that I didn't know; I will be referring to them again. They could do with a few little tweaks, like explicitly stating that the sparc-sources kernel source package is preferable to gentoo-sources on Sparc machines. It's not as obvious as it might seem, as you can use either, but sparc-sources are tweaked for Sparc machines. Fortunately I have a sense of completeness that made me choose sparc-sources straight away; other people have had problems with gentoo-sources on Sparc.

I started this process again last night and emerged lshw and pciutils (for lspci) so I could work out what was in the box. This sucked in X.org as a dependency for some reason and meant I spent another night wearing earplugs, as I was SSHed in again from my desktop machine with its noisy PSU. Meh.

It all finished this morning, so I decided to change the compiler optimisation from -O3 to -O2 to speed up compilation and reduce the size of the binaries. I then did some work on identifying the hardware, got the Sparc kernel sources and cautiously did make menuconfig.

It actually wasn't all that bad, as all of the Sparc hardware options were already selected; I just removed all of the things I didn't have. I did worry that I didn't see options for ebus and a few other things, but I built the kernel anyway and watched it fail on make modules. Fuck. Google. It turns out that kernel 2.4.29 (the latest version of sparc-sources in Gentoo) fails to build on sparc64 due to missing #defines in dmabuf.c when sound is enabled. Well, I had only enabled sound support because I hadn't noticed that the CS4231 sound card uses a separate low-level driver in the kernel, not part of the regular sound system.

Cool. Turned off sound support. Compiled nicely. The rest went pretty much as per the instructions, but it's been one long journey. I still don't have any nice end-user apps. On the hitlist are Gnome and maybe OpenOffice.org, but they are gonna be looooooong compiles.

I think getting the X server to work will be interesting. I have an ATI Technologies Inc 3D Rage Pro 215GP (rev 5c) (thanks once again to a wholesale lspci quote…).

After leaving this post for an hour or two: getting Xorg working is awkward and manual, as the configuration tools can't detect the ATI card, the Sun mouse or the Sun keyboard. After some not very helpful googling and some shot-in-the-dark guessing, I managed to correctly assume that the mouse protocol was busmouse and the device is /dev/sunmouse. Stealing sections of the xorg.conf files from here and here also helped. I just added ati as the graphics driver and I now get an enormous resolution and a moving mouse.
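For anyone fighting the same hardware, the relevant xorg.conf fragments end up looking something like this (the Identifier names are arbitrary, and both sections still need hooking up elsewhere in the file – the InputDevice in ServerLayout, the Device in a Screen section):

Section "InputDevice"
    Identifier "SunMouse"
    Driver     "mouse"
    Option     "Protocol" "busmouse"     # the protocol that turned out to work
    Option     "Device"   "/dev/sunmouse"
EndSection

Section "Device"
    Identifier "RagePro"
    Driver     "ati"                     # the 3D Rage Pro is handled by the ati driver
EndSection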

At the moment I can't see an option to change the resolution in /etc/X11/xorg.conf, and for now Sun keyboards don't work with Xorg 6.8 – they require the deprecated (and apparently no longer supplied) keyboard driver and don't work with the replacement kbd driver. Hmph.

Gentoo (on Sparc64): awkward and drawn out, but that's the cost of doing everything manually and compiling it all yourself. The keyboard problem isn't strictly a Gentoo thing; that's Xorg going through a transitional period. Next I have to work out how Gentoo startup scripts work so I can make ssh, X, gdm and other things start at boot time in the future. With few other options for my Sparc hardware (though I'm sure after all this, installing Debian on it would be a breeze…), Gentoo's pay-off will be in the performance and in the learning I did going through the process.
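On the startup scripts, from what I can gather it boils down to rc-update, which attaches the scripts in /etc/init.d to runlevels. A rough sketch (the display manager itself is picked via DISPLAYMANAGER in /etc/rc.conf, if I've read the docs right):

rc-update add sshd default    # start sshd at boot
rc-update add xdm default     # the xdm script starts whichever display manager is configured (gdm here)
rc-update show                # list what is attached to each runlevel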

Ubuntu Jingle Update

After getting annoyed with the frustratingly fiddly process of getting any kind of decent input from my microphone via my soundcard, and trying Dynebolic on various machines which either run out of RAM (as it's a live CD) or stutter because the Audacity project is running from a USB micro hard disk (i.e. slow read/write access), I have bought a new sound card. It takes so long to check this and that at such granularity that by the time I come to the conclusion that I need to mount the hard disk, put the files on it and run the project from there, it's either midnight and I have to abandon it and go to sleep, or I have more important uni work to do. So it was just easier to buy a new soundcard for my main desktop, as recommended by Ant in the comments on my original Ubuntu Jingle post.

So on his advice I now have a Creative Labs Soundblaster Live on its way. After a bit of research I believe it uses the emu10k1 module.
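Once it's in, a quick sanity check should be something like this, assuming ALSA picks it up automatically:

lspci | grep -i audio      # confirm the card is visible on the PCI bus
cat /proc/asound/cards     # see whether ALSA has claimed it
lsmod | grep emu10k1       # check the emu10k1 module actually got loaded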

Hopefully this will be the end of my complaining and I can get this jingle finished. Either way, I was starting to have other problems with the current soundcard; I just never worried too much about them before. For example, when playing music files the sound would rise and dip randomly. It really is obviously a crap soundcard.

As my cousin once said to me, unless you’re doing real sound work, the soundcard is the last thing anyone ever upgrades. And it is.

Windows is hard to use

I’ve barely used Windows in the last few months and now I have my laptop back I’m stuck with Windows on it until I can sort the crap restore partition thingy out and install Linux.

It’s struck me how hard Windows is to maintain. The amount of calls I get when people explode their Windows installations definitely supports this theory. I have a fresh installation. First thing I do is head to Windows Update and install all of the updates and patches. Then I install Firefox, a firewall and anti-virus. Then I update them. Then I install Microsoft Office (I will be moving to OpenOffice.org on Windows when I am more comfortable with it. By this time I may have finished uni and won’t be using Windows at all). Then I update it.

Then I install all of the million apps you need to make Windows do anything useful. Real Player and its horrible ad-laden bulkiness, QuickTime, Acrobat Reader and all of the other things I never use. An adware remover.

I think adware and spyware are the biggest threats to Windows users at the moment. I watched a video clip the other day that showed a malicious website installing such malware with no visible output to the user, and certainly without asking the user if they wanted to install the software. The guy showed the Program Files directory before and after to show the new software installed. I don't care if XP Service Pack 2 makes you have Automatic Updates turned on; in my experience people just tell it to fuck off when it tells them that there are updates to install. Just booting into Windows and getting prompted to check all these things for updates is a pain in the arse, so much so that I'd prefer not to use it. The rest of the world has no interest in learning why they should care, let alone actually doing any of this, which is why my phone keeps ringing with people complaining that porn and adverts keep popping up, and why I get emailed viruses all the time. I have tried explaining it to them…

Windows takes too much looking after and ordinary people are overwhelmed. Even I, as a technically minded individual, think Windows is a hideous, uncomfortable, over-complicated beast that drains me of energy to use. Linux, in my case Ubuntu, is a case of hitting Reload in Synaptic, then Mark All Updates, then Apply, or whatever. To install stuff, hit search, find your package and choose install. Imagine having all the Windows software you might possibly want to use in a searchable list with an install button next to each one, and an update button to get the latest version of everything you have installed, all at once.
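The command-line equivalent is just as short, which is the point: updating everything is one habit, not a dozen separate programs nagging you (the package name below is just a placeholder):

sudo apt-get update              # refresh the package lists (what Synaptic's Reload does)
sudo apt-get upgrade             # install every available update in one go
apt-cache search some-package    # find something new to install
sudo apt-get install some-package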

But people want Windows. I think this is mainly due to the PR thing, like people saying they want a 'Pentchinum 4' because they've seen it on the TV and their friends have one. I think if Linux were more able to play MP3s, DVD movies, Real, QuickTime, DivX, XviD and the other multimedia formats out of the box, then the only real reason to use Windows would be for games. But if that's all you want a computer for, then buy a console. But try explaining that to people…

I really must get around to testing Ubuntu on an innocent bystander.

Do you play Wolf-ET?

Do you listen to LUG Radio? If so you should join in with clan.lugradio.org.

We play on Wednesday and Sunday nights from 8pm UK time. I’ve played for the last few weeks but the numbers are dropping. There were 3 of us this week and I had to bail early.

If you listen to LUG Radio and play Wolf-ET, please join in.

A Perfect World?

Is it the nature of being computer people that makes us frustrated by real world inanimate objects’ refusal to do what we ask of them and means we are also the only people in the world that actually know how to follow instructions and put things like furniture or toys together?

I know Jono feels like this. Ask him about having to move his cooker with the broken door that refused to stay closed when he was moving house.

Inanimate objects drive me to distraction. They're sooooo fucking stupid. I think this might be to do with the fact that I spend all day interacting with an idealised abstraction of a real world environment which behaves in the same way all the time. You can't drop something on the floor when you remove the item that was on top of it from the fridge when you're using a computerised environment; that kind of clumsiness is already taken care of for you. If it were going to happen, you would probably get a nice little warning message asking if you really want to drop the chicken on the floor, or leave it in the fridge. I believe Jono calls such idiotic objects 'infidels'.

I also thought over Christmas that I was the only person in my house who knows how to follow assembly instructions for new furniture or my niece's toys. My mum just gives up at the point electricity becomes involved, especially when she has to unplug something to plug the new thing in. Not for her to follow cables to see where they go, no. My dad is pretty good at putting stuff together but he is easily misled. I seem to be the only one who can do stuff like this without getting confused. I wonder whether this is also to do with my computerised existence. I spend a lot of time reading howtos, man pages, walkthroughs and so on.

Are we living in an overly perfected world that makes real life frustrating?

The Art of Gentoo (on Sparc64)

Further to my question over whether Gentoo was worth the effort, I decided to actually install it – somewhat prompted by Ron's insistence a while back that Gentoo is great, a chat with a guy called Mark Welch from uni, and also by Fizz's comments.

I got frustrated over Christmas that my degree doesn’t cover anything that doesn’t run in Windows and therefore on x86 hardware and so I bought an old iMac and a Sun Sparc Ultra 10 workstation. (Sidenote: man is CDE butt ugly).

Well I must have been well treated by Linux because I couldn’t work out how to turn the DHCP client on in Solaris 9 (never used Solaris before) and as we all know, computers are pretty fucking boring without a net connection these days. Yeah I could have figured it out in the end, but the management console was starting to fail to open and a few other things so I figured I’d never use Solaris for anything anyway and decided to install Linux on it. I think Sun are sending me a copy of Solaris 10 for entering some competition or other anyway.

It seems nobody really does a mainstream Sparc64 Linux anymore apart from Debian and Gentoo. Debian is my distro of choice, but I'm not really sure what's in the box so I need hardware detection, and I can't be arsed to wait 9 months or however long it's going to take for Sarge to appear; I don't think they've even gone into a freeze yet. So it's Gentoo.

And well, it seems cool but not one you’d give to a beginner to install. I chose the stage 2 Live CD method as it offered the most control without having to know all my hardware.

Hmm, I had to fudge some things, but all of the hardware has worked out of the box so far. I could probably do with tweaking the hard disk performance with hdparm, but I'll worry about that later when I've had time to learn whether my hdparm output was any good and how to tune it.

livecd root # hdparm -tT /dev/hda

/dev/hda:
Timing O_DIRECT cached reads: 716 MB in 2.00 seconds = 358.00 MB/sec
Timing O_DIRECT disk reads: 38 MB in 3.08 seconds = 12.34 MB/sec

livecd root # hdparm /dev/hda

/dev/hda:
multcount = 16 (on)
IO_support = 0 (default 16-bit)
unmaskirq = 0 (off)
using_dma = 1 (on)
keepsettings = 0 (off)
readonly = 0 (off)
readahead = 8 (on)
geometry = 38792/16/63, sectors = 20020396032, start = 0
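When I do get round to tuning, the output above suggests the obvious first try: IO_support is at the 16-bit default while DMA is already on. Something like this, though whether 32-bit I/O is actually safe on this controller is another question:

hdparm -c1 /dev/hda    # enable 32-bit I/O support
hdparm -d1 /dev/hda    # make sure DMA stays on
hdparm -tT /dev/hda    # re-run the benchmark to see if it made any difference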

I’ve never really bothered to compile stuff apart from my own kernels and a few things that weren’t packaged by Mandrake when I used it years ago, so I’ve never learned about compiler optimisations and whatnot. I opted for:

USE="X gtk gnome alsa -kde -qt"
CHOST="sparc-unknown-linux-gnu"
CFLAGS="-mcpu=ultrasparc -O3 -pipe"
CXXFLAGS="${CFLAGS}"
MAKEOPTS="-j2"

I have no idea how good a choice I made (advice gratefully received) but ultimately when I know what I’m doing I’ll rebuild the entire system. I will use it mainly as a backup desktop machine running Gnome. I might use it as a home server later on.

I let mirrorselect choose my mirrors for me and the performance was dismal. For some reason all my chosen mirrors were in the Netherlands (apparently Holland is only part of the Netherlands and it annoys the hell out of the Dutch that people think the country is called Holland), but all my downloads were coming from Korean and Taiwanese mirrors which took 3 minutes to time out, which they did a lot. After 50 minutes I had about 12 packages, so I control-C'ed emerge and manually added the British Blueyonder mirror to /etc/make.conf, and the whole lot came down in about 20 minutes. (Note to self: bash filename auto-completion doesn't work in a web browser window ;)). I use Blueyonder as my Debian apt source and know I can get a sustained 59KB/s transfer on a 512Kb ADSL link.
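For reference, the change is a one-liner in /etc/make.conf; the URL below is a stand-in rather than the actual Blueyonder address:

# /etc/make.conf
GENTOO_MIRRORS="http://your.nearest.mirror/gentoo/"    # distfiles get fetched from here first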

I left it compiling overnight as it was about 2am when I started, and it was all done when I woke up. I stupidly forgot to log out of the SSH session from my desktop machine with the loud PSU and run emerge system locally on the Sun box instead, so I had to wear earplugs overnight…

But it all went fine. Now I'm at the Configuration File Protection and Configuring the Kernel stage, but I just don't have time to absorb all of this reading and sit there and set it all up. I still don't really know what's in the box; lsmod only lists ext3, jbd and openpromfs, so everything else must be compiled into the kernel image (doh, must remember to use lspci…). I did note from dmesg that I have a Sun Happy Meal ethernet card 🙂 That made me smile, I've seen that name in the kernel source over the last few years and thought it was a cute name for a network card 😉 I wonder if Sun or McDonalds came up with it first.

So I now have a half-complete Gentoo installation; I just have to do the reading and finish it off before, I assume, installing all the apps that I want and worrying about bootloaders and stuff. I think that will be a(nother) weekend job…

I bumped into Fizz last night actually, we were both pretty drunk and he asked me if I’d read his comments. I think we mumbled to each other for a few seconds about Gentoo. I think he was more interested in the girl I was with to be honest but it was still good to see him and exchange drunkitudes 😀

Blogging is bad for your academic productivity

Trust me, I know. My performance has nosedived since I started reading blogs. Admittedly I am far more interested in what I am picking up from blogs than I am in writing right outer joins in Oracle's not-completely-ANSI-standard SQL dialect, or in the economic impact of IT globalisation and offshoring. There is no Linux on my degree. There is on the years that follow mine; my year was the last of the old degree scheme. Isn't that really weird? In an era such as this, my only academic contact with a non-Windows operating system is telnetting into a Solaris server to use Oracle.

I’m a Linux guy and blogging is far more interesting. Just don’t tell my lecturers…

Novell produce another crucial open source app

Not content with vying with Canonical to hire some of the best and coolest open source hackers out there, Novell has offered yet another gift to the open source community by announcing the Hula Project.

Hula is a web based calendaring and email server somewhat akin to Microsoft Exchange Server, something that has been lacking in the open source world for years. Although many projects have claimed to offer similar features to Exchange, none has yet offered a clean implementation or, crucially, the shared calendar functionality. They have their eyes on some really cool features like viewing calendars via RSS feeds and interacting with the server from your mobile phone. It is worth noting that Hula is still in the planning stage and has as yet made no releases.

A lot of people were worried when Novell bought Ximian and SuSE, including myself, thinking that they would just get swallowed up in corporate bullshit and slowly die a quiet death. It appears not to be the case.

I'll try to refrain from saying stuff here that I have been intending to use in an article entitled "Why Linux is Good News for Everybody" (feel free to hire me to write this for your publication, email to drinky76 at yahoo dot com ;)), but a large part of this revolves around why Linux is so important to people like Novell, IBM, Sun, HP, Intel and Oracle. They all have products in a shrinking market with one main competitor.

Novell realised they were dying and that pretty soon they wouldn't exist. Novell network and directory services ran on Novell Netware and Windows, but nobody used them on Windows any more and people weren't buying Novell Netware. People were buying Windows and a few people were buying Unix, but the Unix vendors were painting themselves into a corner. And a hell of a lot of people were looking at Linux as the new cool Unix, a possible investment and one to watch as a future competitor to Windows, if not the future of the operating system market. What to do?

Make Novell stuff run on Linux. How to do that? Hire the right people, buy a Linux distributor with the right profile and buy another Linux company that looks like it is pushing Linux in the right way. Red Hat are too big to be bought, you can't buy Debian, so what about SuSE? SuSE are about the right size, have a sizeable market and have the right kind of corporate profile. What about hiring the right people? Well, Ximian are doing some really cool things and have some of the best, most focused hackers out there – Miguel de Icaza, Nat Friedman and so on.

And they did. But they also realised something important that a lot of the big guns miss. You can't win with Linux by just doing your own Linux and going at it with all corporate marketing and PR guns blazing. The community won't give a fuck about you and you won't get anywhere without them. You have to do it right and you have to get the community on side. How to do that? Well, if you have read anything about the open source community, it is characterised in part by the idea that if we all give something to a project (code, patches, money etc.), we all get something greater and more valuable back as a whole system. Novell spotted this and decided that the only way to win with Linux was to give the community what it wanted: open source several high profile and highly desired applications (Ximian Connector, YaST), pay people to work on what they love (Mono, Beagle and so on) and pay people to work on what was sorely needed (an Exchange replacement, among other things). All these things add up to more and more pieces of the jigsaw dropping into place for an open source equivalent to every app in every bedroom/office/server room.

To win in the corporate field, they also realised that they needed to offer Linux services that very few have the might to provide: software support and training. These are big things in the professional world. The thing that scares people most about deploying Linux is that they need somebody to call when things go wrong and someone to take responsibility for it. For Linux to take off, there also needs to be a groundswell of Linux expertise. Linux has always been a bedroom hacker's system, but how can you prove that a bedroom hacker is skilful enough to run your IT infrastructure? Training and qualifications. Novell offer all of this. Wow.

There are big things happening in the open source world at the moment and the future is exciting; damn, I can't wait to see what we have in the next 12 months. Gnome looks like the future of the desktop to me and I've only been using it for 3 weeks. Stuff like Beagle, Xgl and iFolder look like great apps and show clear, ahead-of-the-game, outside-the-box thinking. Windows users won't see this kind of stuff for maybe 2 years. I wonder how many more of them will be using Linux by then. Novell and Canonical (via Ubuntu) are really pushing Linux where it needs to be heading, and Novell are paying for a lot of the core pieces of software to be developed.

Bravo Novell, although I still think Ubuntu is the one true way forward, I might try the Novell Linux Desktop at some point.

The Point of Gentoo

Is… Umm…

Well, a few of Wolves LUG are using Gentoo and think it's great; the main draw seems to be the package management system. I have to be honest, I've been using Debian for a few years (and Ubuntu more recently) and am in the Debian way of thinking. Package management is pretty core to how I evaluate a distro these days, and apt is just the business. So, Gentoo uses Portage. The idea is that you get your package source repositories and build the packages from source in an efficient, well-managed way. It's a great idea, but what's the point? Why build everything from source?

Well, every package is compiled on your own hardware and is therefore optimised for your own machine. Great. But it takes ages. I was told that on a very fast modern system, building all of the packages for a fresh install takes a weekend. You can expect stuff like OpenOffice.org, KDE, Gnome or X to take around 8 hours each. Phew.
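For anyone who hasn't seen both, the day-to-day update routine looks roughly like this; these are the standard commands, though the exact flags people use vary:

# Debian/Ubuntu: fetch the package lists, then install prebuilt binaries
apt-get update && apt-get dist-upgrade

# Gentoo: sync the Portage tree, then rebuild anything that changed from source
emerge --sync
emerge --update --deep world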

The point of this post is the argument about precompiled packages versus locally compiled, optimised packages, and whether the performance boost is worth the time lost compiling. Your software is optimised for the machine it was built on and hence runs faster than pre-compiled packages, but is the gain in responsiveness worth the time spent compiling? Sure, you can still use the machine while you compile, but what you lose in compile time, will you get it back in response time? In a desktop environment you can probably compile and continue to work, everything will just take longer, but what about a server?

I can see the point in an environment where the software must run fully optimised for the hardware, but what do you do at update time? Take the performance hit of compiling new updates? Won’t that throw off the whole performance thing? Sure, it was said that where this is the case you have a backup machine which runs while such updates are going on. But isn’t this a sidestep? What is more expensive? 2 machines or 1 better machine?

It's a fantastic idea if you like to know your software is running as fast as it can, but is it worth the hit at compile time? I don't think the speedup you gain is greater than the time you lose compiling.

This of course is just an opinion and I do aim to take a look at Gentoo sometime soon…

Laptop Update

So, regular readers (heh ;)) will know about my Tiny laptop saga. Well, despite their 7-10 working day return time, I got my laptop back 4 weeks from the day I logged the fault call. And well, it's great. Nice new keyboard, and the processor fan doesn't make the whirring, clipping sound it did the last time they changed it, so, all cool.

Except that after twice phoning me and telling me Windows needed to be replaced at a cost of ~£60, which I refused, they installed Windows XP Home anyway. Weird. It hasn't shown up on my invoice or bank statement, so it looks like they just did it for nothing. We like free stuff :D. The fact that it's Windows makes it less pleasurable, but it now means I have a spare Windows license to put on something, which isn't such a great thing, but at least I have it if I really need it.

When I logged the fault call they asked me for my Windows license key and I said I didn’t know because I had to wipe it. I forgot that the Windows sticker was on the underside so they must have just banged a new copy on and used my old license key. Why didn’t they just do this before instead of trying to charge me for it? Bastards.

Well, anyway, so now I’m back to square one. I have a copy of Windows on my laptop and want to put Linux, more specifically Ubuntu on it. I still need Windows to do some uni work and would prefer to use the copy already on there. So I’m back to this irretrievable partition problem. I think I’ll just set up the Windows installation how I want it, then Ghost the partition and use it to write the partition back after I wipe the disk and repartition.
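If Ghost turns out to be a pain, a crude free-software stand-in is dd from a live CD. A rough sketch (the device names are guesses, so check with fdisk -l first, and the partition you restore to needs to be at least the same size):

dd if=/dev/hda1 of=/mnt/storage/winxp.img bs=4M    # image the Windows partition to spare storage
# ...wipe the disk, repartition, then write it back:
dd if=/mnt/storage/winxp.img of=/dev/hda1 bs=4M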

Let the day soon be that I no longer need Windows…

Linux at the forefront of a desktop graphics revolution?

Wow. Look at this post from Nat Friedman.

Xgl is a new X server that uses OpenGL for 3D-accelerated rendering. I think Nat's blog explains it all better than I can.

I don't know a great deal about graphics subsystems, but I think this kind of idea is just the kind of great thinking that will make Windows and Mac OS X users sit up and take notice. I know Microsoft have a lot of new graphics stuff in mind for Longhorn, but surely people will be using this first, and it will probably be far more powerful. OS X looks great but I'm not sure how far their graphics subsystem goes. Could it do stuff like this? I don't think so.

Eye candy rules in desktop world. If I could demo the possibilities of this to my dad he’d want it and that means a lot to me, I want everyone to want Linux.

I meant to write something a little more profound than this, but just like Nat, I’m really tired – it’s late and my thoughts are just starting to slow down.

Just go look at it.

Wow, I’m popular

I set up my website http://www.drinky.org.uk/ a long time ago. I started in Microsoft Frontpage Express (this was before I knew what Linux was). As everyone that ever used it will know, Frontpage Express was utter crap, but I didn't know any better. Today the same basic design exists, as I've never had the time to rewrite the whole thing, though I've been meaning to move it over to some kind of CMS for a while. I basically use it as a dumping ground and as a portal for when people ask me stuff. It's a terrible, disorganised, ugly mess.

Because it's so poorly put together I've never really pimped it, and for around 4 years I have been living under the illusion that nobody ever really reads it. Until today.

I have this workshop to do for a uni module. It basically involves checking the Apache server logs on the uni webserver and grepping the output for your own site. Until about a week ago, my entire website was hosted on the uni server with DNS forwarding to point at the relevant place. So I expected some kind of logs. What I didn't expect was for today's logs to scroll off the terminal for a few minutes. On a busy site maybe, but not for my pathetic effort. How wrong I have been.

It seems that the Big Snake (not suitable if you are squeamish) is popular with the employees of Samsung in Korea and quite a few other people. Why, I don't know. I'd forgotten all about it. It's a verbatim copy and paste of an email I received about 2 or 3 years back, complete with annoying caps-lock-on text. All I can assume is that someone must have come across it and emailed a link to a few friends who then forwarded it on and on and on; it must be doing the rounds in Korea at the moment. Bizarre. But aside from that, my site has been getting a regular hammering from all over the world. I'm really surprised. I didn't think anyone read my site at all.

So, having moved my hosting over to the account kindly provided by Sparkes, I decided to check the logs for my new hosting and my blog subdomain. Wow. Obviously not as prolific as the old one, it’s only been up for a few days, but still busy. And then I noticed something really cool.

Some of the biggest referrers are Planet Gnome, Planet Ubuntu and Planet Debian. Holy crap. People on some of the coolest blog syndicates are reading *my* blog. In the last 24 hours. Jeee-zus.

So off I went to Bloglines to have a look, as I'm subscribed to all of those planets. Planet Gnome first, as it was the biggest referrer. It seems that Jeff Waugh has linked to my Ubuntu Jingle post. That really freaked me out. Nobody really knows I've got a blog yet, apart from the LUG Radio guys. Jeff was interviewed at some ungodly hour of the morning by the LUG Radio team, the day after Australia Day, and still managed to be intelligent and entertaining.

So. Shit. I’ve been linked to by one of the coolest guys in the open source world. That really made my day 😀 Guess I have to finish my jingle now…

I’m watching the Brit Awards…

I’m going out in a minute but I’m watching the Brit Awards (the British music awards).

Best British Rock Band…

Surely I can’t be the only person in the world that is able to recognise the fact that Franz Ferdinand are shit.

Can I?

At least it wasn’t The Darkness. Ugh.