Is there a way to search a Linux system for all files owned by a certain user? A user on one of my machines is over his quota, but is using less than half of his allocated space in his inbox.
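A minimal sketch of one common approach, assuming GNU find and coreutils, with `bob` as a hypothetical username:

```shell
# List every file under /home owned by user "bob" (hypothetical name);
# -xdev keeps find from crossing into other mounted filesystems.
find /home -xdev -user bob -type f

# Total the space those files occupy (GNU du reads the NUL-separated list):
find /home -xdev -user bob -type f -print0 | du -ch --files0-from=- | tail -n 1
```

Run it as root against / (or against each quota-bearing filesystem) to catch files outside his home directory; stray files in /tmp or /var/spool are a common way to blow a quota without the inbox growing.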
I have 8 VA Linux Systems servers. VA Linux seems to be out of the hardware market now, and I was wondering where I can get hard drive mount kits for these boxes. Does anyone know, or am I S.O.L.?
|Cut and paste
I'm giving Fedora Core 1 a try on my laptop.
It seems to be mostly working, but somehow cut and paste in X is broken. I've got "Emulate 3 Button Mouse" checked in the mouse preferences. I'm also using the standard xterm (though there are times it would be nice to use the more bloated GNOME Terminal).
Is there a POP/IMAP server out there that will report in the maillog not just the number of messages downloaded, but the amount of data downloaded?
|Webcam question ...
Are there any easy-to-use, easy-to-install webcam apps out there that'll snap a picture and then send it to a server via FTP? Current Mood: perplexed
I apologize for the off-topic post, just a quickie...
My processor fan is making the most god-awful noise; I suspect it's on its last legs. How much is it going to set me back to replace it?
I thought about posting this to Slashdot, but then I learned a little more about it and realized that it's not big enough yet to warrant that attention.
I'm a consultant at Merck Pharmaceuticals.
They just put together a team (aka 2 people) who are evaluating the possibility of moving to a Linux desktop for the company (roughly 50,000 employees/consultants).
This would be a HUGE win for Linux, and I'll keep you all posted about it.
If it gets to be a larger initiative, I'll get all the details and post it to Slashdot. Current Mood: hopeful
|Server FTP capacity
I've got a server set up on a Pentium 4 2.5 GHz system with 512 MB of RAM, running Red Hat 9.
Can anyone give me an idea of how many simultaneous FTP connections this server could comfortably handle without dying or slowing to a crawl? Assume the FTP server is the "major player" on this box, and all other uses (email, web serving, etc.) have a negligible impact.
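As a rough RAM-bound upper estimate only — the ~2 MB-per-connection and ~100 MB-for-the-OS figures below are assumptions, not measurements; check your actual ftpd's resident size with `ps`:

```shell
# Back-of-envelope: how many ftpd processes fit in RAM before swapping,
# assuming ~2 MB RSS per connection and ~100 MB reserved for OS/cache.
ram_mb=512
reserved_mb=100
per_conn_mb=2
echo $(( (ram_mb - reserved_mb) / per_conn_mb ))   # → 206
```

In practice the disk or the network pipe usually saturates well before RAM does, so treat this as a ceiling, not a target.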
Also, if one makes a file available for download via HTTP as well as FTP, is there any way to limit the number of HTTP connections?
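For the HTTP side, assuming Apache httpd (which Red Hat 9 ships): the stock knob is the server-wide MaxClients directive, which caps simultaneous connections for the whole server rather than for one file; per-file or per-IP limits need a third-party module such as mod_limitipconn.

```
# httpd.conf fragment (prefork MPM) -- caps the whole server, not one file:
MaxClients 50
```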
I recently converted a counter application from flat files to MySQL 4.1.1, and it has considerably reduced my performance. The count.cgi (Perl) script gets the referrer information and so on, writes it to a file in /tmp (using `echo "blah" > /tmp/$$.up` -- I thought this would be the least resource-intensive way), and then a script that runs constantly, holding a connection to the database, inserts the entries and deletes the files (each under 200 bytes). 'top', however, shows that it's count.cgi that's using the most resources, running at 20-50% user CPU per process. I downloaded the sysstat package earlier today, and 'iostat' is showing write rates of over 100 KB/sec, in some cases over 500 KB/sec! Short of the Apache log files themselves, and of course the /tmp files, nothing else is being consistently written to the drive.
Any ideas why such a simple cgi script would cause so much CPU usage and/or I/O activity? Any ideas how I can track down these hundreds of kilobytes being written? Any advice at all? :D
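One thing worth ruling out (a guess, not a diagnosis): if the Perl script runs that `echo` through backticks, every hit pays a fork/exec of a shell on top of the file create, and under load that shows up as exactly this kind of user-CPU burn. A quick bash comparison of forked vs. in-process writes:

```shell
# Write 500 tiny files by forking a shell per write (what backticks cost)...
tmp=$(mktemp -d)
time for i in $(seq 1 500); do sh -c "echo blah > $tmp/slow-$i.up"; done

# ...vs. writing from the current process, with no fork per file.
time for i in $(seq 1 500); do echo blah > "$tmp/fast-$i.up"; done

rm -rf "$tmp"
```

To track down the writes themselves, attaching `strace -f -e trace=write,open -p <pid>` to a running count.cgi process (or running `lsof +D /tmp`) will show which file descriptors those hundreds of KB/sec are going to.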
1) Anyone use Xandros Linux and care to comment on their experiences with it?
2) if you don't use it, are you considering it?
3) any thoughts on it in general? (w/o starting a my distro tops your distro flame war)
4) Wouldn't it be awesome if it were endorsed by Olivia Newton-John, and their slogan was "now we are here... in xandros"