Well, nobody seems to comment on ‘good stories’ – maybe I should rant more often? Anyway, it seems I have a reputation in the GNOME world as being an arsehole, so why not. Threats about employers reading the blog? Yeah, nice one.
I realised I was going to offend the author of PackageKit – but seriously, this is not ready for release as the main package UI on any distro. If it did well what it did, it might be OK, but it has some way to go. Maybe he needs some offending to get his arse into gear, make it happen and prove me wrong? It’s out there in the wild now as the primary update interface on a public release of a popular distribution – he’s gotta expect criticism, and he’s gotta expect at least some rants. It’s not like I’m mailing the guy or spamming his blog’s comments – it’s my blog.

If the app is busy, it needs to make that obvious, not just sit around for tens of seconds appearing to do what you asked and then come up with a blank list for no apparent reason. I have the fastest net connection I can buy, but it was a bit busy at the time – still, why does PackageKit not cache any metadata, at least for the current session? How come it takes 100MB to run yum – if that’s all it’s doing? I thought surely it was doing more than that. Installing one package at a time is not good enough – computers are designed to run batch processes automatically, so why force me to handle it? Why is every operation serialised when many don’t need to be? The machine was fine when running yum by itself (even with the ‘busy’ network), so it isn’t the CPU, memory or network (it wasn’t swapping even while running the update thing). You can complain that I need a faster box or net – but I don’t need either for running anything else I want to run.
I’ll just say one more thing – you can’t have it both ways. If it wasn’t ready for prime time, you should have asked Fedora not to use it; by getting it into the distribution you get the exposure and fame and flashing lights, but you have to expect that exposure to generate a range of opinions, from positive to negative. It isn’t personal (how could it be? we don’t know each other) – although I know it is impossible not to take it that way.
Ubuntu is mostly fine – but after giving it a pretty good run, I think it’s just not for me. It’s too focused on newbies and windowsies, not ‘veteran’ Linux users – yes, that is their target of course, but attracting developers wouldn’t hurt them either. Venting my personal frustration on my own blog should be ‘allowed’, and I don’t think I need to ask anyone for permission. Debian is known for making strange decisions through an overly politicised process driven by strong-willed individuals – my opinions there are nothing new (and that is all I meant by ‘*BSD-like’, by the way).
And yes, Synaptic is quite nice. The only things I don’t really like about it are that it’s quite slow at searching and at listing packages. I can’t remember what I used on SUSE (10), but from (somewhat unreliable) memory I thought it was faster and nicer to use.
Umm, if the network cable comes loose, I’d presume the network would come back online all by itself when it got plugged back in – just like it always has. I certainly shouldn’t have to be logged in and running a crapplet (NB: I didn’t invent the word) for it to reconnect, or have to run any command as root. Do you run a desktop on a web server just to configure its network? Can’t their cables also get knocked out?
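A server does this with plain system-wide config, no login session required – roughly like the following sketch (Red Hat/Fedora-style network scripts of the era; the device name and values are illustrative, not from any particular box):

```
# /etc/sysconfig/network-scripts/ifcfg-eth0  (illustrative values)
DEVICE=eth0
BOOTPROTO=dhcp     # the dhcp client keeps renewing/retrying, logged in or not
ONBOOT=yes         # brought up by the system network service at boot
```

With that in place, the link layer recovers by itself when the cable goes back in, and the dhcp client picks up where it left off – no applet in sight.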
I installed ‘everything’ (desktop, web server, developer), but of the three applications I use daily on any computer, one wasn’t installed. Of course I’m going to complain when some fluff that directly affects my user experience was installed instead – hey, at least the man pages and info files are there. And are you saying PackageKit is not installed by default in most configurations? I’d never heard of it – how could I go looking through dozens of packages to remove it? I imagine any ‘desktop’ install would include the other stuff too. I’ve been there and done that – trying to tune my system exactly how I wanted before I installed it. All I did was end up with a broken system and waste even more time fixing it at both ends. I rarely even run the disk partitioner any more either, since I’ve had more than one failed install from trying to get things how I wanted them.
As for Mono – I don’t hate Mono. Mono is OK, and technically quite a feat too. I think the effort could have gone elsewhere, personally (hint: that means it’s opinion), but although it might not be obvious, I do know where Miguel is coming from, I ‘get’ what he is trying to do, and it is a good thing that someone is doing it, with plenty of financial backing to pursue it and the grand vision and enthusiasm for it. OK, initially I was lumbering along with Evolution, Bonobo and ETree and thought he was just starting another ill-fated quick-results project he’d leave for someone else to finish! But that was the very early days – I worked on a Mono plugin for Evolution after all, but nobody seemed to want it so I gave up.

However, he had to expect political backlash from many people, given he was effectively cloning MS technology … and much later on, the Novell–MS deal didn’t help. No matter what the (secret) realities of it are, it’s done, and it will always be hanging over the project for some people – WHICH IS A TERRIBLE SHAME, because so much time and effort has gone into it and there are plenty of good ideas and technology there. Politics in and around technical projects totally sucks, but that’s the reality. Time will tell anyway – and it tends to iron out issues like this by itself.
For me, there are currently no apps I need that use it (F-Spot is a really good app, but I don’t need it), and I want to avoid the temptation to write .NET code at home (odd, yes, but indeed true). And yes, I do have personal misgivings about any MS technology in general, and specifically about any on my Linux box, but that was just the icing on the cake. What I hate is .NET itself. I use it at work. On a Windows box. That’s a lot to hate. At least with Mono there’s the potential that one day they’ll be able to address the main memory issue with any VM-based system – that of having multiple virtual machines running for separate applications. Would putting them all in one environment work? They do it for Java in enterprise/backend applications – how come it isn’t used for desktop applications as well? If the whole desktop and most of the applications ran from a single VM, it would probably do rather well – probably better than ‘n’ C applications which each have to initialise their application toolkit separately and which share only read-only C library data and code between them (and an order of magnitude better than ‘n’ C language engines loading (and compiling) ‘n’ language ‘scripts’ which load ‘m’ language ‘libraries’, sharing only the memory-mapped C library parts between them – where by ‘C’ I mean pre-compiled, memory-mappable code/data).
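As a rough illustration of that sharing argument: on a Linux box you can split any process’s mapped memory into the shared portion (mostly memory-mapped, read-only library code and data) and the private portion (its own heap and writable data). A sketch, assuming a Linux /proc with smaps, here inspecting the awk process itself:

```shell
# Sum the shared (e.g. memory-mapped read-only library pages) versus
# private (per-process heap and writable data) parts of a process's memory.
awk '/^Shared_(Clean|Dirty):/  { shared  += $2 }
     /^Private_(Clean|Dirty):/ { private += $2 }
     END { printf "shared: %d kB, private: %d kB\n", shared, private }' \
    /proc/self/smaps
```

Point it at the PID of any two C apps and only the shared column overlaps between them; everything private is paid again per process, which is exactly the cost a single shared VM could avoid.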
I do rather dislike Python, mind you – somewhat how I dislike Visual Basic. I don’t really hold strong opinions about any other languages but those two. Well, JavaScript isn’t that high on my list either, and Tcl has some issues at that. On reflection, maybe the reason is simple – the generally bad experience (IN MY EXPERIENCE – it’s called opinion; not everyone agrees with it, but it’s mine, and I hold it) I’ve had with VB/Python apps. And well, BASIC sucks.
BTW, with Fedora my install was still left with init running at level 3 (vi fixed that). I’m not sure I ever told it I didn’t want X running (I installed a desktop system, after all). Maybe it had something to do with a text-mode install, but I can’t remember being given the option to turn X on or off (apart from setting it up).
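For anyone hitting the same thing, the vi fix is the usual one-line change – a sketch from memory, assuming the classic SysV init that Fedora used at the time:

```
# /etc/inittab – the "initdefault" entry picks the default runlevel;
# 3 = multi-user text-only, 5 = multi-user with X (Red Hat convention)
id:5:initdefault:
```

Mine had been left at `id:3:initdefault:`, hence no X at boot.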