A script for keeping git up to date
Felipe Sateler
fsateler at gmail.com
Thu May 6 03:17:42 UTC 2010
On Wed, May 5, 2010 at 16:09, Reinhard Tartler <siretart at tauware.de> wrote:
> On Mon, May 03, 2010 at 00:55:43 (CEST), Felipe Sateler wrote:
>
>> Ideas on how to make it faster are welcome (somehow finding out all the
>> up-to-date repositories at once would be good). The script is
>> especially slow in the best-case scenario (when all repositories are up
>> to date). Also, it currently does not abort when a pull was not a fast
>> forward, since gbp-pull does not return non-zero on that event. A bug
>> has been filed against git-buildpackage.
>> Finally, if some package has been removed from the pkg-multimedia
>> area, it will not be removed from the local computer.
>
> What I wonder is what's the correct workflow.
It depends on what you want to achieve.
> Your script clones every
> pkg-multimedia git repository.
That's the whole purpose :). I want to be able to look up a package's
source without having to clone it JIT.
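For the curious, here is roughly the kind of loop I mean (a minimal
sketch, not the actual script: repo-list.txt and the git URL are
placeholders, the ls-remote check is just one way to skip repositories
that are already up to date, and the ancestry test at the end works
around gbp-pull not signalling a non-fast-forward):

#!/bin/sh
# Illustrative sketch only; repo-list.txt and the URL are assumptions.
BASE=git://git.debian.org/git/pkg-multimedia

while read pkg; do
    if [ ! -d "$pkg" ]; then
        gbp-clone "$BASE/$pkg.git" || echo "clone failed: $pkg" >&2
        continue
    fi
    (
        cd "$pkg" || exit 1
        old=$(git rev-parse HEAD)
        # Cheap up-to-date check: compare the remote's advertised HEAD
        # with ours (approximate, it only looks at the default branch).
        [ "$(git ls-remote origin HEAD | cut -f1)" = "$old" ] && exit 0
        gbp-pull
        # gbp-pull does not fail on a non-fast-forward, so verify that
        # the old HEAD is still an ancestor of the new one.
        if [ "$(git merge-base "$old" HEAD)" != "$old" ]; then
            echo "non-fast-forward pull in $pkg" >&2
        fi
    )
done < repo-list.txt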
> Do you have numbers on how much disk space
> this takes?
felipe at pcfelipe:pkg-multimedia% du -s .
1.1G .
However, running git gc over all of them recovers a bit of space (not
much, probably because most of the disk usage is in the checked-out
working trees, which git gc does not compress).
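A quick way to see what git gc actually buys, assuming all the
checkouts sit side by side in the current directory (just a sketch):

# Rough before/after comparison: total size, repack everything, total again.
du -s .
for d in */; do
    (cd "$d" && git gc --quiet)
done
du -s .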
Just for fun, let's get the 10 piggies out of here:
felipe at pcfelipe:pkg-multimedia% \du -s * | sort -nr | head
120520 vlc
71268 snd
68580 mplayer
58024 audacity
56176 rosegarden
38156 hydrogen-drumkits
33720 ardour
27948 csound
27236 ffmpeg
25160 freewrl
> How much data needs to be transferred, and how long does git
> need for that?
fsateler-guest at alioth:~% du -sh /git/pkg-multimedia
1.1G /git/pkg-multimedia
So that's an upper bound. I didn't measure the actual transfer, though.
I wonder if git gc would yield any space savings in there.
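A rough way to estimate the per-package transfer would be to make a
throwaway bare mirror and measure it; vlc is just an example here and
the URL is a placeholder:

# Rough per-package transfer estimate (URL is illustrative).
tmp=$(mktemp -d)
git clone --mirror git://git.debian.org/git/pkg-multimedia/vlc.git "$tmp/vlc.git"
du -sh "$tmp/vlc.git"
rm -rf "$tmp"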
>
> Just a brainfart, but perhaps this functionality can be implemented
> better in gbp-clone by adding a concept of group-cloning?
Probably. But that would require me to actually think through how to do
this elegantly and then code it up in Python, and I just don't have the
time right now.
>
> BTW, the list of repos can probably be generated by a cron job; that way
> non-team members can also test and use your script.
How so? A cron job on alioth with the results exported somewhere?
Non-team members still get read access. Non-alioth users are another
matter, though.
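If that's the idea, something like the following crontab entry could do
it (the paths are guesses; the htdocs location in particular is an
assumption about the group's web space on alioth):

# Hypothetical crontab entry on alioth; both paths are assumptions.
# m h  dom mon dow  command
17 4 * * * cd /git/pkg-multimedia && ls -d *.git | sed 's/\.git$//' > /home/groups/pkg-multimedia/htdocs/repo-list.txt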
--
Regards,
Felipe Sateler