Friday 15 August 2008

crazy idea: user supported snapshot.debian.net

I just had this crazy idea for solving snapshot.debian.net's space problem, which just might work if done right.

We all know that snapshot.debian.net ran out of space in early May this year and we all lost a valuable service. Since there are probably many users, developers and maintainers who find this service useful, it makes sense to think that the users themselves could solve the problem.

How? Think about implementing a truly distributed storage/file system (not necessarily a real file system) that relies on cluster nodes which can join or leave at any time and are contributed by users, very much like the GoogleFS file system (or clients in a torrent swarm).


First problems that I can already see:
  • there must be some indexing service running 24x7 to manage the information about which blocks are wanted
  • some data should always be kept on a trusted machine - the gpg signatures, the Packages files, indexing information and metadata
  • there might be a security risk involved since data would be stored on untrusted machines, but a sha1/md5 check in collaboration with the gpg signature should be enough (see the sketch after this list), although that isn't always doable - see large packages like openoffice.org or game data packages; if there are reasons why this might not be secure, maybe we need to rethink a little the dpkg-sig package and see if we can get all the packages in the archive to be individually signed when entering the archive, or something of that sort
  • the meta-data of the FS cluster might be really big, but my gut feeling tells me that it won't be as big as the current snapshot.d.n storage
  • when a file is requested, it needs to be cached on a central server so it can be assembled and its checksum verified before being delivered to the client requesting it
  • people might donate really little space compared to their usage
  • some nodes of the network are special and this could lead to the failure of the entire infrastructure in case the special nodes fail; some of these nodes might turn out to need really big muscles
  • some data might be unavailable at times depending on the clients connected to the network at different points in time
  • in an attempt to create more copies of a block on more nodes in the network, a DoS might be triggered if many clients request the same info from a (set of) node(s) containing the rare info - still, the torrent way of working seems to cope quite well with that model
  • none of this is implemented and I probably won't do it myself, although it could be a really nice project, even if it is doomed to fail from day 1
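
To make the checksum idea above a bit more concrete, here is a minimal sketch of how a reassembled file could be verified before delivery, assuming a gpg-signed Release/Packages pair is at hand (the package name, file names and the SHA1 field are just for the example, and the Release-to-Packages hash check is glossed over):

# the chain of trust would be gpg -> Release -> Packages -> .deb
gpg --verify Release.gpg Release || exit 1
# expected SHA1 of the package, as recorded in the (verified) Packages file
expected=$(awk '/^Package: wormux$/{found=1} found && /^SHA1:/{print $2; exit}' Packages)
# compare against what the untrusted nodes delivered
echo "$expected  wormux_0.8-1_i386.deb" | sha1sum -c -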

Other ideas:
  • maybe it would make more sense for the FS to actually be an ever-growing torrent which people can connect to via some special client that would allow pushing to the clients in order to store information
  • the number of copies of a file (or chunk of a file) in the distributed storage should be higher for more recent packages and lower for older ones (still, one copy might need to be stored on safe nodes - aka controlled and trusted by Debian so the information doesn't get lost - which brings us back to our current storage problem)
  • probably the entire thing could be built by piggy-backing on the current apt-transport-debtorrent implementation or debtorrent, or on apt's cache - the idea is that although not all files might be available, some people might still have relatively recent packages in their apt cache
  • such a system, if done right, might prove to be usable as a live backup system between different nodes in the cluster, so it could be used in other situations by individuals, without the need of a central server - think trackerless torrents
What do other people think?

Thursday 14 August 2008

vga=what in lilo.conf for 1280x800 1440x1024

With wide screen laptops becoming more frequent, using 1280x800 or something else like 1440x1280, most people would want the console in VESA mode to use the entire screen, not just 1024x768.

I was in that situation and wanted to know the "secret code" needed for that console mode, since the modes for 1280x800 seem to be vendor specific and not a VESA standard. They are not present in the official documentation but can be found with a simple command:

hwinfo --framebuffer | grep 1280x800

Pick one of the reported modes that makes sense for you, add it to lilo.conf, run lilo (and etckeeper commit, if you use etckeeper) and reboot. Enjoy.
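
For example, assuming hwinfo reported something like "Mode 0x0361: 1280x800 (+1280), 24 bits" (the mode number is vendor specific, so treat 0x0361 as a placeholder), the whole change amounts to:

# in the image section of /etc/lilo.conf that you boot:
#   vga=0x361        (or the decimal form, 865, if your lilo dislikes hex)
lilo                                        # rewrite the boot map so the new mode is picked up
etckeeper commit "switch console to 1280x800"   # only if /etc is kept in etckeeper
reboot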

migrating from svn to git for wormux's maintenance

I have been thinking for quite some time about migrating the maintenance of the wormux package to git, but in a really smart way so that:
  • we preserve history currently stored in svn
  • a new integration branch should exist - we used the incomplete package model with svn-buildpackage, but now it makes more sense to have the complete history, including the upstream code at the time
  • all previous orig tarballs should exist in the git repo in pristine-tar format
  • upstream's svn repo should be included entirely so that we can cherry-pick any patches we might want or need
Since both the package repo and the upstream repo are in SVN, you can't simply use git-svn for both in a single repo, so I started thinking of some scheme to be able to join both repos and the (uncommitted) orig tarballs (which are backed up on DGT's webspace).
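
The starting point was simply two independent git-svn clones, roughly like this (the URLs are placeholders, not the real repository locations; -s assumes the usual trunk/branches/tags layout):

git svn clone -s svn://svn.example.org/pkg-games/wormux wormux-pkg    # repo A: the packaging repo
git svn clone -s svn://svn.example.org/wormux wormux-upstream         # repo B: upstream's repo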


Tonight I experimented a little with the git-svn clones I made previously and eventually concluded that:
  • the main repo should track the upstream repo via git-svn
  • I should properly tag the debian package versions in the svn.d.o repo, so svn tags no longer look like branches in git
  • ideally all the branches in the package repo should be reproduced in some corresponding branch in the final repo
  • the integration branch (that should contain the real, complete Debian packages) should be created with a non-existent merge strategy which I call overlay (basically, for two branches A and B, the strategy amounts to: all files as in A are present, plus any files unique to B)
  • using pristine-tar forces me to keep the 'upstream' branch reserved, so I need to avoid using that name for anything else

It seems the overlay merge could be obtained via this small set of commands (I am sure it needs some modification to make the integration work in another branch C):

# run from branch A (the branch whose files must win)
git merge --no-commit -s ours B   # record B as a parent, but keep A's tree for now
git checkout B -- .               # bring every file from B into the index and working tree
git checkout A -- .               # then re-add A's files, so A wins wherever both have the path
git commit
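
A possible adaptation so that the result lands on a separate integration branch C instead of on A itself (untested, and C is just a hypothetical branch name):

git checkout -b C A
git merge --no-commit -s ours B
git checkout B -- .
git checkout A -- .
git commit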


Note: Looking back, I think I can do a better job next time and I will see if I can use the resulting repo to create a new repo and drop the svn package maintenance for wormux. If this works, I might be able to create a script to allow migration of svn-bp maintained packages to git, at least for the Debian Games Team.


Since you can't fix a problem unless you take things one at a time, I started by cloning the initial git-svn generated repos.

So after waiting for some time (both git-svn fetch commands took long enough), I had two git-svn repos:
  • repo A - the package repo
  • repo B - the upstream svn clone repo
I tried to clone both A and B, to avoid destroying the original A and B clones, and noticed that the svn branches were not visible from the cloned repo since the origins didn't have any local branches other than master - the svn trunk for each of them. So I added tracking branches in A for the experimental and sarge versions of the package. I also added some tags in A so that the svn tags don't get lost along the way.
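
For the record, the extra branches and tags in A were created along these lines (the svn remote ref names and the tag name are placeholders, since the exact refs depend on the git-svn layout in use):

git branch debian-experimental refs/remotes/experimental   # placeholder ref name
git branch debian-sarge refs/remotes/sarge                  # placeholder ref name
git tag debian/0.7.9-1 refs/remotes/tags/0.7.9-1            # one such tag per svn tag worth keeping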

In the end I got my A' repo that had:

0 eddy@heidi ~/usr/src/wormux/tt/debian-bare $ git br -a
debian-experimental
debian-sarge
debian-sid
* master
origin/debian-experimental
origin/debian-sarge
origin/master


Yes, having master in A' is not a good idea, but since I made A' a bare repo I had to live with it. Well, one more thing to improve in iteration 2.


Excellent, now I can go into B and add A' as a new remote, which I should have called pkgsvn but ended up calling 'origin'. Since git-svn doesn't treat the svn repo as a true remote, this was possible.
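
In concrete terms (the directory layout matches the prompts shown in this post):

cd ~/usr/src/wormux/tt/git-full        # this is B, the git-svn clone of upstream
git remote add origin ../debian-bare   # A'; 'pkgsvn' would have been a better name
git fetch origin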



Due to some confusion I also added 'ups-' as a prefix to the svn remote trunk and tags, but I am still unsure whether this was necessary - I'll have to look more closely at the pristine-tar restrictions.

After all of this I got this branch configuration:
0 eddy@heidi ~/usr/src/wormux/tt/git-full $ git branch
* master
0 eddy@heidi ~/usr/src/wormux/tt/git-full $ git branch -r
0.8-final
0.8beta4
origin/debian-experimental
origin/debian-sarge
origin/debian-sid
origin/master
ups-tags/wormux-0.7
ups-tags/wormux-0.7.2
ups-tags/wormux-0.7.3
ups-tags/wormux-0.7.4
ups-tags/wormux-0.7.9
ups-tags/wormux-0.7.9rc1
ups-tags/wormux-0.7beta3
ups-tags/wormux-0.8
ups-tags/wormux-0.8alpha1
ups-tags/wormux-0.8beta1
ups-tags/wormux-0.8beta2
ups-tags/wormux-0.8beta3
ups-tags/wormux-0.8beta4
ups-trunk
wormux-0.7
wormux-0.7.9
wormux-0.8beta1
wormux-0.8beta2


So more tracking branches were necessary; after adding tracking branches for the branches originating from A', I fired up 'gitk --all' and got a nice picture.
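
The tracking branches themselves were nothing fancy, just local branches created on top of the fetched refs, e.g.:

git branch debian-sid origin/debian-sid
git branch debian-experimental origin/debian-experimental
git branch debian-sarge origin/debian-sarge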


This is not yet complete and now I am feeling sleepy, so I'll continue once I experiment more.

Monday 11 August 2008

git-core backport still useless

The versioned dependency on tk8.5 was removed in 1:1.5.6.3-1~bpo40+4, but that still isn't enough:

$ gitk
/usr/bin/gitk: line 3: /usr/bin/wish8.5: No such file or directory
/usr/bin/gitk: line 3: exec: /usr/bin/wish8.5: cannot execute: No such file or directory


The irony is that you can simply work around it with:

# ln -sf /usr/bin/wish8.4 /usr/bin/wish8.5


But the proper fix would be to apply this patch:


diff --git a/debian/rules b/debian/rules
index 7a1e771..7bd25cd 100755
--- a/debian/rules
+++ b/debian/rules
@@ -7,7 +7,7 @@ CFLAGS =-g -Wall
STRIP =strip
TEST =test
OPTS =NO_OPENSSL=1 prefix=/usr mandir=/usr/share/man INSTALLDIRS=vendor \
- WITH_P4IMPORT=1 PYTHON_PATH=/usr/bin/python TCLTK_PATH=/usr/bin/wish8.5 \
+ WITH_P4IMPORT=1 PYTHON_PATH=/usr/bin/python TCLTK_PATH=/usr/bin/wish \
THREADED_DELTA_SEARCH=1

ifneq (,$(findstring nostrip,$(DEB_BUILD_OPTIONS)))
--
1.5.6.3

Tuesday 5 August 2008

git-core etch backport is useless

Update:

It seems that the lenny version (1:1.5.6.3-1) plus some small changes to allow building and running on etch does not have this problem.

The changes include:

* backported to etch
- compile against tcl8.4, instead of tcl8.5
- gitk and git-gui depend on tk8.4, instead of tk8.5
- drop the versioned dep on asciidoc and docbook-xsl


It is really nice that git-core is backported to etch, but it is kind of hard to convince people to use it when gitk needs tk8.5, which is unavailable in etch.

This is the reason why I previously made a local backport (1:1.5.6-1~bpo40+1~local), thinking the missing/extra[*] dep was a temporary glitch. According to bug 456423, this should be safe.


Does backports.org have a buildd network? IIRC, it uses the experimental buildd network, but that doesn't explain the desynchronisation of the packages in etch-backports.


Oddly enough, the build now fails on my etch machine during the tests in t9400-git-cvsserver-server.sh (looks like it is not related to the tk8.5 change):

* FAIL 23: cvs update (create new file)
echo testfile1 >testfile1 &&
git add testfile1 &&
git commit -q -m "Add testfile1" &&
git push gitcvs.git >/dev/null &&
cd cvswork &&
GIT_CONFIG="$git_config" cvs -Q update &&
test "$(echo $(grep testfile1 CVS/Entries|cut -d/ -f2,3,5))" = "testfile1/1.1/" &&
diff -q testfile1 ../testfile1

* expecting success: echo line 2 >>testfile1 &&
git add testfile1 &&
git commit -q -m "Append to testfile1" &&
git push gitcvs.git >/dev/null &&
cd cvswork &&
GIT_CONFIG="$git_config" cvs -Q update &&
test "$(echo $(grep testfile1 CVS/Entries|cut -d/ -f2,3,5))" = "testfile1/1.2/" &&
diff -q testfile1 ../testfile1
Counting objects: 5, done.
Compressing objects: 100% (2/2), done.
Writing objects: 100% (3/3), 282 bytes, done.
Total 3 (delta 1), reused 0 (delta 0)
Unpacking objects: 100% (3/3), done.
To gitcvs.git
0879984..e6bd37b master -> master
cvs update: Updating .
cvs update: New directory `master'
* FAIL 24: cvs update (update existing file)
echo line 2 >>testfile1 &&
git add testfile1 &&
git commit -q -m "Append to testfile1" &&
git push gitcvs.git >/dev/null &&
cd cvswork &&
GIT_CONFIG="$git_config" cvs -Q update &&
test "$(echo $(grep testfile1 CVS/Entries|cut -d/ -f2,3,5))" = "testfile1/1.2/" &&
diff -q testfile1 ../testfile1

* checking known breakage:
mkdir test &&
echo >test/empty &&
git add test &&
git commit -q -m "Single Subdirectory" &&
git push gitcvs.git >/dev/null &&
cd cvswork &&
GIT_CONFIG="$git_config" cvs -Q update &&
test ! -d test

Counting objects: 4, done.
Compressing objects: 100% (2/2), done.
Writing objects: 100% (3/3), 388 bytes, done.
Total 3 (delta 0), reused 0 (delta 0)
Unpacking objects: 100% (3/3), done.
To gitcvs.git
e6bd37b..4a13ee9 master -> master
cvs update: Updating .
cvs update: New directory `master'
* FIXED 25: cvs update w/o -d doesn't create subdir (TODO)

* expecting success: (for dir in A A/B A/B/C A/D E; do
mkdir $dir &&
echo "test file in $dir" >"$dir/file_in_$(echo $dir|sed -e "s#/# #g")" &&
git add $dir;
done) &&
git commit -q -m "deep sub directory structure" &&
git push gitcvs.git >/dev/null &&
cd cvswork &&
GIT_CONFIG="$git_config" cvs -Q update -d &&
(for dir in A A/B A/B/C A/D E; do
filename="file_in_$(echo $dir|sed -e "s#/# #g")" &&
if test "$(echo $(grep -v ^D $dir/CVS/Entries|cut -d/ -f2,3,5))" = "$filename/1.1/" &&
diff -q "$dir/$filename" "../$dir/$filename"; then
:
else
echo >failure
fi
done) &&
test ! -f failure
Counting objects: 13, done.
Compressing objects: 100% (4/4), done.
Writing objects: 100% (12/12), 754 bytes, done.
Total 12 (delta 1), reused 0 (delta 0)
Unpacking objects: 100% (12/12), done.
To gitcvs.git
4a13ee9..f573218 master -> master
cvs update: Updating .
cvs update: New directory `master'
grep: A/CVS/Entries: No such file or directory
grep: A/B/CVS/Entries: No such file or directory
grep: A/B/C/CVS/Entries: No such file or directory
grep: A/D/CVS/Entries: No such file or directory
grep: E/CVS/Entries: No such file or directory
* FAIL 26: cvs update (subdirectories)
(for dir in A A/B A/B/C A/D E; do
mkdir $dir &&
echo "test file in $dir" >"$dir/file_in_$(echo $dir|sed -e "s#/# #g")" &&
git add $dir;
done) &&
git commit -q -m "deep sub directory structure" &&
git push gitcvs.git >/dev/null &&
cd cvswork &&
GIT_CONFIG="$git_config" cvs -Q update -d &&
(for dir in A A/B A/B/C A/D E; do
filename="file_in_$(echo $dir|sed -e "s#/# #g")" &&
if test "$(echo $(grep -v ^D $dir/CVS/Entries|cut -d/ -f2,3,5))" = "$filename/1.1/" &&
diff -q "$dir/$filename" "../$dir/$filename"; then
:
else
echo >failure
fi
done) &&
test ! -f failure

* expecting success: git rm testfile1 &&
git commit -q -m "Remove testfile1" &&
git push gitcvs.git >/dev/null &&
cd cvswork &&
GIT_CONFIG="$git_config" cvs -Q update &&
test -z "$(grep testfile1 CVS/Entries)" &&
test ! -f testfile1
rm 'testfile1'
Counting objects: 3, done.
Compressing objects: 100% (2/2), done.
Writing objects: 100% (2/2), 233 bytes, done.
Total 2 (delta 1), reused 0 (delta 0)
Unpacking objects: 100% (2/2), done.
To gitcvs.git
f573218..2ed3834 master -> master
cvs update: Updating .
cvs update: New directory `master'
* ok 27: cvs update (delete file)

* expecting success: echo readded testfile >testfile1 &&
git add testfile1 &&
git commit -q -m "Re-Add testfile1" &&
git push gitcvs.git >/dev/null &&
cd cvswork &&
GIT_CONFIG="$git_config" cvs -Q update &&
test "$(echo $(grep testfile1 CVS/Entries|cut -d/ -f2,3,5))" = "testfile1/1.4/" &&
diff -q testfile1 ../testfile1
Counting objects: 4, done.
Compressing objects: 100% (2/2), done.
Writing objects: 100% (3/3), 300 bytes, done.
Total 3 (delta 1), reused 0 (delta 0)
Unpacking objects: 100% (3/3), done.
To gitcvs.git
2ed3834..3f3e927 master -> master
cvs update: Updating .
cvs update: New directory `master'
* FAIL 28: cvs update (re-add deleted file)
echo readded testfile >testfile1 &&
git add testfile1 &&
git commit -q -m "Re-Add testfile1" &&
git push gitcvs.git >/dev/null &&
cd cvswork &&
GIT_CONFIG="$git_config" cvs -Q update &&
test "$(echo $(grep testfile1 CVS/Entries|cut -d/ -f2,3,5))" = "testfile1/1.4/" &&
diff -q testfile1 ../testfile1

* expecting success: echo Line 0 >expected &&
for i in 1 2 3 4 5 6 7
do
echo Line $i >>merge
echo Line $i >>expected
done &&
echo Line 8 >>expected &&
git add merge &&
git commit -q -m "Merge test (pre-merge)" &&
git push gitcvs.git >/dev/null &&
cd cvswork &&
GIT_CONFIG="$git_config" cvs -Q update &&
test "$(echo $(grep merge CVS/Entries|cut -d/ -f2,3,5))" = "merge/1.1/" &&
diff -q merge ../merge &&
( echo Line 0; cat merge ) >merge.tmp &&
mv merge.tmp merge &&
cd "$WORKDIR" &&
echo Line 8 >>merge &&
git add merge &&
git commit -q -m "Merge test (merge)" &&
git push gitcvs.git >/dev/null &&
cd cvswork &&
sleep 1 && touch merge &&
GIT_CONFIG="$git_config" cvs -Q update &&
diff -q merge ../expected
Counting objects: 4, done.
Compressing objects: 100% (2/2), done.
Writing objects: 100% (3/3), 302 bytes, done.
Total 3 (delta 1), reused 0 (delta 0)
Unpacking objects: 100% (3/3), done.
To gitcvs.git
3f3e927..28c1b4a master -> master
cvs update: Updating .
cvs update: New directory `master'
* FAIL 29: cvs update (merge)
echo Line 0 >expected &&
for i in 1 2 3 4 5 6 7
do
echo Line $i >>merge
echo Line $i >>expected
done &&
echo Line 8 >>expected &&
git add merge &&
git commit -q -m "Merge test (pre-merge)" &&
git push gitcvs.git >/dev/null &&
cd cvswork &&
GIT_CONFIG="$git_config" cvs -Q update &&
test "$(echo $(grep merge CVS/Entries|cut -d/ -f2,3,5))" = "merge/1.1/" &&
diff -q merge ../merge &&
( echo Line 0; cat merge ) >merge.tmp &&
mv merge.tmp merge &&
cd "$WORKDIR" &&
echo Line 8 >>merge &&
git add merge &&
git commit -q -m "Merge test (merge)" &&
git push gitcvs.git >/dev/null &&
cd cvswork &&
sleep 1 && touch merge &&
GIT_CONFIG="$git_config" cvs -Q update &&
diff -q merge ../expected

* expecting success: ( echo LINE 0; cat merge ) >merge.tmp &&
mv merge.tmp merge &&
git add merge &&
git commit -q -m "Merge test (conflict)" &&
git push gitcvs.git >/dev/null &&
cd cvswork &&
GIT_CONFIG="$git_config" cvs -Q update &&
diff -q merge ../expected.C
Counting objects: 5, done.
Compressing objects: 100% (3/3), done.
Writing objects: 100% (3/3), 303 bytes, done.
Total 3 (delta 1), reused 0 (delta 0)
Unpacking objects: 100% (3/3), done.
To gitcvs.git
28c1b4a..1f0f9c8 master -> master
cvs update: Updating .
cvs update: New directory `master'
diff: merge: No such file or directory
* FAIL 30: cvs update (conflict merge)
( echo LINE 0; cat merge ) >merge.tmp &&
mv merge.tmp merge &&
git add merge &&
git commit -q -m "Merge test (conflict)" &&
git push gitcvs.git >/dev/null &&
cd cvswork &&
GIT_CONFIG="$git_config" cvs -Q update &&
diff -q merge ../expected.C

* expecting success: cd cvswork &&
GIT_CONFIG="$git_config" cvs -Q update -C &&
diff -q merge ../merge
cvs update: Updating .
cvs update: New directory `master'
diff: merge: No such file or directory
* FAIL 31: cvs update (-C)
cd cvswork &&
GIT_CONFIG="$git_config" cvs -Q update -C &&
diff -q merge ../merge

* expecting success: echo Line 9 >>merge &&
cp merge cvswork/merge &&
git add merge &&
git commit -q -m "Merge test (no-op)" &&
git push gitcvs.git >/dev/null &&
cd cvswork &&
sleep 1 && touch merge &&
GIT_CONFIG="$git_config" cvs -Q update &&
diff -q merge ../merge
Counting objects: 5, done.
Compressing objects: 100% (3/3), done.
Writing objects: 100% (3/3), 303 bytes, done.
Total 3 (delta 1), reused 0 (delta 0)
Unpacking objects: 100% (3/3), done.
To gitcvs.git
1f0f9c8..e6c5fc1 master -> master
cvs update: Updating .
cvs update: New directory `master'
* ok 32: cvs update (merge no-op)

* expecting success:
touch really-empty &&
echo Line 1 > no-lf &&
echo -n Line 2 >> no-lf &&
git add really-empty no-lf &&
git commit -q -m "Update -p test" &&
git push gitcvs.git >/dev/null &&
cd cvswork &&
GIT_CONFIG="$git_config" cvs update &&
rm -f failures &&
for i in merge no-lf empty really-empty; do
GIT_CONFIG="$git_config" cvs update -p "$i" >$i.out
diff $i.out ../$i >>failures 2>&1
done &&
test -z "$(cat failures)"

Counting objects: 4, done.
Compressing objects: 100% (2/2), done.
Writing objects: 100% (3/3), 305 bytes, done.
Total 3 (delta 1), reused 0 (delta 0)
Unpacking objects: 100% (3/3), done.
To gitcvs.git
e6c5fc1..4d0f85b master -> master
cvs update: Updating .
cvs update: New directory `master'
cvs update: Updating .
cvs update: New directory `master'
cvs update: Updating .
cvs update: New directory `master'
cvs update: Updating .
cvs update: New directory `master'
cvs update: Updating .
cvs update: New directory `master'
* FAIL 33: cvs update (-p)

touch really-empty &&
echo Line 1 > no-lf &&
echo -n Line 2 >> no-lf &&
git add really-empty no-lf &&
git commit -q -m "Update -p test" &&
git push gitcvs.git >/dev/null &&
cd cvswork &&
GIT_CONFIG="$git_config" cvs update &&
rm -f failures &&
for i in merge no-lf empty really-empty; do
GIT_CONFIG="$git_config" cvs update -p "$i" >$i.out
diff $i.out ../$i >>failures 2>&1
done &&
test -z "$(cat failures)"


* expecting success:
mkdir status.dir &&
echo Line > status.dir/status.file &&
echo Line > status.file &&
git add status.dir status.file &&
git commit -q -m "Status test" &&
git push gitcvs.git >/dev/null &&
cd cvswork &&
GIT_CONFIG="$git_config" cvs update &&
GIT_CONFIG="$git_config" cvs status | grep "^File: status.file" >../out &&
test $(wc -l <../out) = 2
Counting objects: 5, done.
Compressing objects: 100% (2/2), done.
Writing objects: 100% (4/4), 366 bytes, done.
Total 4 (delta 1), reused 0 (delta 0)
Unpacking objects: 100% (4/4), done.
To gitcvs.git
4d0f85b..a146cf3 master -> master
cvs update: Updating .
cvs update: New directory `master'
Invalid module '' at /home/eddy/usr/src/tools/git-backports/1.5.6.3-1~bpo40+3~local/git-core-1.5.6.3/t/../git-cvsserver line 2895,
line 18.
cvs [status aborted]: end of file from server (consult above messages if any)
* FAIL 34: cvs status

mkdir status.dir &&
echo Line > status.dir/status.file &&
echo Line > status.file &&
git add status.dir status.file &&
git commit -q -m "Status test" &&
git push gitcvs.git >/dev/null &&
cd cvswork &&
GIT_CONFIG="$git_config" cvs update &&
GIT_CONFIG="$git_config" cvs status | grep "^File: status.file" >../out &&
test $(wc -l <../out) = 2

* expecting success: cd cvswork &&
GIT_CONFIG="$git_config" cvs status -l | grep "^File: status.file" >../out &&
test $(wc -l <../out) = 1
Invalid module '' at /home/eddy/usr/src/tools/git-backports/1.5.6.3-1~bpo40+3~local/git-core-1.5.6.3/t/../git-cvsserver line 2895,
line 19.
cvs [status aborted]: end of file from server (consult above messages if any)
* FAIL 35: cvs status (nonrecursive)

cd cvswork &&
GIT_CONFIG="$git_config" cvs status -l | grep "^File: status.file" >../out &&
test $(wc -l <../out) = 1

* expecting success: cd cvswork &&
GIT_CONFIG="$git_config" cvs status | grep ^File: >../out &&
! grep / <../out
Invalid module '' at /home/eddy/usr/src/tools/git-backports/1.5.6.3-1~bpo40+3~local/git-core-1.5.6.3/t/../git-cvsserver line 2895,
line 18.
cvs [status aborted]: end of file from server (consult above messages if any)
* FAIL 36: cvs status (no subdirs in header)

cd cvswork &&
GIT_CONFIG="$git_config" cvs status | grep ^File: >../out &&
! grep / <../out

* fixed 1 known breakage(s)
* failed 11 among 36 test(s)
make[2]: *** [t9400-git-cvsserver-server.sh] Error 1
make[2]: Leaving directory `/home/eddy/usr/src/tools/git-backports/1.5.6.3-1~bpo40+3~local/git-core-1.5.6.3/t'
make[1]: *** [test] Error 2
make[1]: Leaving directory `/home/eddy/usr/src/tools/git-backports/1.5.6.3-1~bpo40+3~local/git-core-1.5.6.3'
make: *** [build-arch-stamp] Error 2



[*] depends on how you look at it

Friday 1 August 2008

Recommending the MSI MegaBook PR200WX-058EU laptop to Linux users

I recently bought a new MSI laptop and I am really pleased with my choice so far.

Since with the previous laptop I managed to kill two of my desired features, long battery life and good portability, I decided it was time to look carefully at what the market has to offer.

I settled on an MSI PR200WX-058EU, which has the following:
  • built on a Centrino platform
    • Intel(R) Core(TM)2 Duo CPU T8300 @ 2.40GHz
    • Intel Corporation PRO/Wireless 4965 AG or AGN
    • Intel mobile chipset
  • 3GB of memory (yes, I would have enjoyed 4, but it seems the 32 bit barrier still decides hardware configurations)
  • 320GB HDD
  • DVD+/-RW
  • Mobile GM965/GL960 Integrated Graphics Controller
  • 12" wide screen (1280x800)
  • 1 Realtek Semiconductor Co., Ltd. RTL8111/8168B PCI Express Gigabit Ethernet
  • 8 cell battery by default (long battery life)
  • 1 USB 2.0 Camera - uses the linux-uvc driver
  • bluetooth
  • 1 card reader
  • 3 USB ports, 1 HDMI, 1 D-sub 15 connector, 1 PCI Express port, modem port
  • fingerprint reader
  • an advertised weight of 1.8kg (I weighed it at 2.1kg with the 8 cell battery)

In other words, a small and mobile powerhouse for which I paid 3800 RON (approx. 1100€). I'd say not too bad at all.

Thanks to Gonéri Le Bouder and some searches on the internet I concluded that the brand is not bad at all, and now that I have it I am sure of it.

Of course, I installed Debian Lenny on it (I also sent an installation report) and, in spite of the initial problems, I managed to make the laptop work pretty nicely. I am especially excited about the webcam, which works with the linux-uvc driver.


There were some issues, but I managed to fix some of them, while ignoring others. The LaptopTestingTeam page for the MSI PR200 on the Ubuntu wiki was very helpful.
What have I been ignoring?
  • it seems that from time to time the battery charge status is incorrect, or acpi reports that the AC is plugged in when it isn't
  • headphones don't automatically turn off the speakers
  • fingerprint reader is not used yet, but I intend to experiment with it at some point (authentication via the fingerprint would be cool)
  • the keyboard seems a little bit too hard (but I hope it will loosen up with time)
  • the modem probably isn't working, but I don't think I'll ever try it
  • there are some issues in gnome-power-manager which cause strange behaviour and grief wrt screen brightness
  • sleep doesn't work, but hibernation does



What would you say if I told you that the battery lasts 4 hours or even more with the wlan on and in use (browsing, installing new packages, editing configs) and the brightness set to minimum?


I am really excited about this and I think I have found my next generation laptop, so if you're looking for a cheap mobile powerhouse on which Linux must run, this laptop might be for you.

I find the following to be selling points (for a Linux buyer or others):
  • comes with FreeDOS, so no extra money for an OS you don't use
  • wifi works
  • webcam works with linux-uvc
  • no big problems during install (the most severe are already fixed)
  • long battery life, even when using wifi (4+ hours in browsing+some package installation mode, according to my tests)
  • light enough to carry around easily (2.1kg, 2.4kg with charger)
  • nice design - some people asked me if I bought a MacBook
  • 4 state kill switch for all possible combinations of wifi/bluetooth
  • hibernate works with 2.6.24 or newer (sleep doesn't, but I will try a few things later)
  • there's no mechanical latch for the lid, just a magnet, so there's no plastic to break and the laptop is more robust
  • the 8 cell battery is thicker than the regular one, allowing better ventilation
  • it seems it doesn't get too hot to put on my lap (still, I am a little afraid of blocking its ventilation due to the fabric of my clothes, so I'll probably try to carry around a hard paperboard or something like that)
  • the high resolution (1280x800 on a 12" screen) seems to partly compensate for the reduced physical size of the screen
  • bright screen

So, thanks again to Gonéri, the Debian Installer team, the LVM2 maintainers, the Ubuntu Laptop Testing team, the linux-uvc developers, the Lilo developers and maintainers, and all the nice people who made and still make Debian possible.