Archive for October, 2000

Wednesday, October 18th, 2000

David O’Toole writes:

[...] Looking at stuff like this
makes me get just a tiny bit upset about how badly
the linux world is dragging its political feet
with respect to improving the interface. I’m not talking
about making
all the OK buttons respond to the Enter key
(currently my biggest pet peeve about GNOME, and it is slowly
being fixed in recent GIMP releases, etc.)

I’m talking about the imaging model. I don’t
want to criticize X unfairly. The X Window System was
brilliant for its
time and in its environment. But it simply does not
support well enough what people want to do now:
fast vector imaging, transparency,
high-resolution monitors, antialiasing. Yes, you can
implement software on top of it,
but there’s no standard and it’s slow.

The first defense I hear all the time is network
transparency. I respond: who cares.

Well… I, for one, care very much about the network
transparency of X. I am currently typing this from a
Solaris machine on which I have other windows displayed
remotely from a Linux machine and other Solaris machines.
Not only some XTerms and Emacs that could also work over
telnet/rsh/ssh, but also graphical applications like Purify,
Quantify, Netscape, XMMS and some other goodies. They are
all on the same LAN so speed is not really an issue.
Without X’s ability to display anything anywhere, writing
and debugging my programs would be much harder.

So maybe I am among the 1% of people who really use
remote displays and would not be satisfied with text-based
remote logins. This does not mean that nothing should be
done for the other 99% who would like to get much better
performance from the applications that are running on the
local display.

I don’t think that it is necessary to throw X away and to
start again from scratch. The DGA extension (available on
OpenWindows and XFree86) proves that you can get decent
performance out of X, although this requires some specific
code that is rather ugly and not easy to write and
maintain. Most programmers do not want to write some
additional code for specific X extensions, and indeed they
should not be required to do so.

But it would be possible to get better performance
while keeping the X API. Imagine that someone modifies the
shared X library (Xlib) so that if the client connects
to the local server, all X calls which are normally sent to
the X server over a socket would be translated into some
optimized drawing operations accessing the video buffer
directly. The shared X library would more or less contain
some bits of the server code (actually, a stub could dlopen
the correct code). If the X client connects to a remote
server, then the X function calls would fall back to the
standard X protocol. All clients that are dynamically
linked to that modified library would automatically benefit
from these improvements without requiring any changes to the
code. So it can be done without throwing away the benefits
of X.
Actually, I believe that some people are working on that at
the moment…
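The dispatch idea above can be sketched as follows. This is only an illustrative sketch in Python, not real Xlib code, and the function names are hypothetical; the point is that the local-versus-remote decision is a single, cheap choice made when the client connects:

```python
import os

def is_local_display(display):
    """True if a DISPLAY string names the local server.

    Local displays look like ':0', ':0.0' or 'unix:0'; anything
    with a host name before the colon goes over the network.
    """
    host, _, _ = display.partition(":")
    return host in ("", "unix")

def dispatch_draw_call(display=None):
    """Choose the rendering path, as the modified Xlib would.

    Returns 'direct' for the fast local path (the library would
    render into the video buffer, using server-side code pulled
    in with dlopen) and 'wire' for the standard X protocol
    fallback used with remote servers.
    """
    display = display or os.environ.get("DISPLAY", ":0")
    return "direct" if is_local_display(display) else "wire"
```

A client dynamically linked against such a library would hit this choice point once per connection, not once per call, so remote clients would pay nothing extra for the fast local path.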

Wednesday, October 4th, 2000

Question: maximum information density in the
print-scan process?

Does anybody know how much information can be stored and
reliably retrieved from a piece of paper, using a standard
printer (inkjet or laser, 300dpi) and a scanner (1200 dpi)?
Since a piece of paper can be affected by bit rot
(literally) and can be damaged in various ways, some
error correction (e.g. Reed-Solomon) and detection (e.g.
CRC) is necessary. Also, I do not want to rely on
high-quality paper, so I have to accept some ink diffusion
and “background noise” introduced by defects in the
paper.
I found some references to 2D barcodes
(such as DataMatrix, PDF-417
and others), but these codes are designed to be scanned
efficiently by relatively cheap and fast CCD scanners. I am
not worried about the scanning time (I am using a flatbed
scanner) or the processing time (I can accept some heavy
image processing). Also, I would like to encode raw bits
and pack as much information as possible on a sheet of
paper, regardless of its size. These 2D barcodes have a
fixed or maximum symbol size and it is necessary to use
several of them if I want to fill a sheet of paper, wasting
space in the duplicated calibration areas and guard
zones.
PDF-417 has a maximum density of 106 bytes per square
centimeter (686 bytes per square inch, for you retrogrades),
which is quite low. It is certainly possible to do better,
but I would like to know if there are any standards for
doing that. I am especially interested in methods that are
in the public domain, because most 2D barcodes are patented
(e.g. PDF-417 is covered by US patent 5,243,655
and DataMatrix is covered by US patents 4,939,354
and 5,124,536).
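For budgeting purposes, the overhead of error detection and correction can be computed explicitly. This is only an illustration with assumed parameters (Reed-Solomon (255,223) blocks, each spending 4 data bytes on a CRC); it does not implement either code:

```python
def net_payload(paper_bytes, rs_n=255, rs_k=223, crc_bytes=4):
    """Usable data left after error-correction and detection overhead.

    paper_bytes: raw bytes the printed grid can hold
    rs_n, rs_k:  Reed-Solomon block size and data size (parity = n - k)
    crc_bytes:   bytes of each block's data spent on a CRC checksum
    """
    blocks = paper_bytes // rs_n          # whole RS blocks that fit
    return blocks * (rs_k - crc_bytes)    # data bytes minus the CRC

# Example: a US-letter page with roughly 93 printable square inches,
# at an assumed 2000 raw bytes per square inch.
print(net_payload(2000 * 93))  # 159651 bytes, i.e. about 156 KB
```

So even after generous parity and per-block checksums, a single page could hold on the order of 150 KB under these assumptions.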

If you know any good references, please post them in a
diary entry (I try to check the recent diaries once a day,
but I may miss some of them) or send them to me by e-mail:
quinet (at) gamers (dot) org. Thanks!

Hmmm… This is a bit long for a diary entry. But I
don’t think that such a question deserves an article in the
front page. If you think that I should have posted this
as an article, then send me an e-mail and I will re-post
this question and edit it out of my diary.

Monday, October 2nd, 2000

I posted my comments
on using GdkRgb in Ghostscript, in the LinuxToday discussion
about Raph’s open
letter to the Ghostscript community. IMHO, GdkRgb is the
best solution and those who see it as an attempt to force
them to use “Gnome stuff” on their desktop do not understand
the way Ghostscript works or what GdkRgb is.

This is not new, but it looks like anything that mentions
Gnome is flamed by KDE bigots, and vice-versa (yes, it does
happen both ways). The interesting thing here is that the
most vocal critics are not developers and/or show clearly
that they do not understand what they are talking about.
Sure, they want someone (who?) to fork Ghostscript,
presumably to create a highly productive KDE branch or
something like that. What a bright idea! Sure, they could
get rid of any Bonobo linking, but throwing GdkRgb away
would be stupid.

Sigh! Even if you are careful about what you communicate
(I think that Raph’s letter
was nice and explained very well that using GdkRgb would
have no influence on KDE), some morons will find a way to
interpret it differently.