Thursday, May 10, 2007

Snagdar Lives!

A long time ago I wrote a little script called snagdar that simplified fetching Darwin source. It broke when OpenDarwin went away and Apple started requiring your ADC login to download some sources.

I finally got around to updating the script so that it now works with the current Darwin source site. However, since it now requires your ADC login name and password, you must create a ~/.snagdarpass file with this information.

Let's see how it works from scratch. We first need to get snagdar. You can download it from here, or you can get it by doing the following.

$ curl -s > snagdar
$ chmod +x snagdar

Now let's use it to get the source for the kernel.

$ ./snagdar xnu
./snagdar: line 46: /Users/jgm/.snagdarpass: No such file or directory
ERROR: no username and password found

Yikes! Oh, right, snagdar now needs a valid ADC login and password so we need to add them to the file ~/.snagdarpass.

$ echo "username=someone" > ~/.snagdarpass
$ echo "password=something" >> ~/.snagdarpass
$ chmod 0600 ~/.snagdarpass
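Since ~/.snagdarpass holds plain key=value lines in shell syntax, the simplest way for a script to pick them up is to source the file directly, which would also explain the "line 46" error above when the file is missing. Here is a minimal sketch of that pattern; the function name and the path argument are made up for illustration, not snagdar's actual code:

```shell
#!/bin/sh
# Sketch: load ADC credentials from a key=value file by sourcing it.
# load_snagdarpass and its path argument are hypothetical; snagdar
# itself reads ~/.snagdarpass.
load_snagdarpass() {
    pass_file="${1:-$HOME/.snagdarpass}"
    if [ ! -r "$pass_file" ]; then
        echo "ERROR: no username and password found" >&2
        return 1
    fi
    . "$pass_file"    # defines $username and $password
    [ -n "$username" ] && [ -n "$password" ]
}
```

The chmod 0600 above matters because the password sits in the file in plain text.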

Now we should be able to fetch all the source we want.

$ ./snagdar xnu

+++++ Snagging
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 6609k  100 6609k    0     0  1588k      0  0:00:04  0:00:04 --:--:-- 2581k

And if you want to fetch all the Darwin sources, you can tell snagdar to fetch everything matching the regular expression ., as in the following example.
$ ./snagdar .


That's pretty much it. See the original post for more details.

Also, thanks to weltonch777's post for showing how to authenticate with Apple and store the cookie file using curl.
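For the curious, the cookie trick boils down to two curl calls: a POST that logs in and writes the ADC session cookie out (-c), and a second call that replays the cookie (-b) for the download. Here's a rough sketch of the pattern. The URLs and form field names are placeholders, not Apple's real endpoints, and $CURL is a variable only so the sketch can be exercised without touching the network:

```shell
#!/bin/sh
# Sketch of cookie-based ADC auth with curl. The login URL and form
# fields are placeholders; snagdar carries the real ones.
CURL="${CURL:-curl}"

adc_fetch() {
    login_url="$1"; dl_url="$2"; cookie_file="$3"

    # Step 1: POST credentials; -c saves the session cookie.
    $CURL -s -c "$cookie_file" \
        -d "username=$username" -d "password=$password" \
        "$login_url" >/dev/null

    # Step 2: replay the cookie with -b while fetching the tarball.
    $CURL -s --location-trusted -b "$cookie_file" "$dl_url"
}
```

Typical use would be something like `adc_fetch "$login_url" "$dl_url" "$cookie_file" | tar zxf -`.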


Grady Haynes said...

Thanks, man, that rocks!

leeg said...

Now you just need to have it untar all the files, put them into SVN and have OpenGrok provide a searchable UI, like I did when I had a server I could do that on :-(. I used darwinxref for the grabbing though, and I guess that isn't maintained any more.

vivacarlie said...

I want to put together my own GNU/XNU system. This will be like Linux From Scratch. Where do I start?

Raymond W said...

It seems you need --location-trusted in the final curl. Instead of

curl -b $cookie_file $dl_url | tar zxf -

use

curl --location-trusted -b $cookie_file $dl_url | tar zxf -

The site is 302ing to a new location, and if you do not let curl follow the redirect and let your u/p travel with the redirect, the DL does not happen.

Anonymous said...

How about curl -L ... ?