Cheeeeeeeeese!

Well, Mark finally got his webcam going, completely under Linux.

Earlier today we talked, and he was wondering how to get the images up to the server. [1] He didn't want to use FTP, since it seems no one can actually write an FTP server that isn't Swiss cheese, nor did he want to use scp, as that would require him to type in a password manually, or leave one lying around in a script on his box.

I offered to write some programs: a client on his end, and a server here that does nothing but receive the image and copy it into place. Simple enough in theory.

But the details get pretty gory pretty quickly.

Then it hit me: he's running a webserver on his end. Easy enough to have the camera software dump the picture into a web-accessible place on his box, then have the server here use wget to download the image.
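The pull side can be about as simple as a cron job. A minimal sketch, assuming hypothetical URLs and paths (these are not Mark's actual ones):

```shell
#!/bin/sh
# Sketch of the server-side pull.  URL and DEST are made-up
# placeholders; substitute the real webcam URL and document root.
URL="http://example.com/webcam/current.jpg"
DEST="/var/www/html/webcam/current.jpg"
TMP="$DEST.tmp"

# Download to a temp file first, then rename: the rename is atomic
# on the same filesystem, so visitors never see a half-downloaded
# image.
wget -q -O "$TMP" "$URL" && mv "$TMP" "$DEST"
```

Run from cron every minute or so; wget's `-q` keeps the cron mail quiet.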

The hard part came in configuring Apache. [2]

Problem one: restricting access on his side. Access control there seems to be somewhat broken; it might be a 1.3.3 problem. We're still working on this.
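For the record, the usual Apache 1.3 way to restrict a directory to a single host looks something like this (the directory path and IP address are hypothetical, and whether this is what's misbehaving in 1.3.3 is still an open question):

```apache
# In httpd.conf (or the relevant .htaccess): deny everyone
# except the server doing the pulling.
<Directory /home/mark/public_html/webcam>
  Order deny,allow
  Deny from all
  # Hypothetical address of the pulling server.
  Allow from 192.168.1.10
</Directory>
```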

Problem two: content expiry on the server side. Fixed after some experimentation. In the web page he has:

```
<META HTTP-EQUIV="refresh" CONTENT="60">
```

And in an .htaccess file he has:

```
ExpiresActive On
ExpiresDefault A60
ExpiresByType image/jpg A60
```

And that seems to do the proper job.
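With `A60`, mod_expires sets the `Expires` header to the access time plus 60 seconds, so caches re-fetch roughly in step with the meta refresh. One way to check that the header is actually going out (the URL here is just a placeholder):

```shell
# -S prints the server's response headers, and --spider skips
# downloading the body, so this just shows what the server sends.
wget -S --spider http://example.com/webcam/current.jpg
# Among the headers, look for an Expires: line dated about
# 60 seconds after the Date: line.
```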

[1] http://www.conman.org/

[2] https://httpd.apache.org/
