Perl lock file while updating
You have a few options, in growing order of complexity:

1) Just time-and-date-stamp every line. When you need to examine the coalesced file, you interleave all the input files. For extra safety, you can make your logging scripts create a new log file every period of time (say, 5 minutes), and make your daemon ignore files that are younger than five minutes.

3) Write a daemon, using Net::Daemon or similar, which handles the writing of log entries in a central manner. The CGI script instances would pass the log strings to this daemon through a socket.
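Whichever option you pick, concurrent writers appending to one file need to serialize with `flock`, which is what the question title is really about. Here is a minimal sketch of option 1; the file path and the `log_line` helper name are illustrative, not from the original post:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(:flock SEEK_END);
use POSIX qw(strftime);

# Hypothetical helper: append one time-and-date-stamped line to a
# shared log, holding an exclusive flock so that the ~2000 concurrent
# CGI writers cannot interleave partial lines.
sub log_line {
    my ($path, $msg) = @_;
    open my $fh, '>>', $path or die "Cannot open $path: $!";
    flock($fh, LOCK_EX)      or die "Cannot lock $path: $!";
    seek($fh, 0, SEEK_END);  # re-seek to EOF after acquiring the lock
    print {$fh} strftime('%Y-%m-%d %H:%M:%S', localtime), " $msg\n";
    close $fh or die "Cannot close $path: $!";  # close releases the lock
}

log_line('/tmp/coalesced.log', 'request handled');
```

Note that `flock` only protects against other processes that also call `flock` on the same file; it is advisory, so every one of the writing scripts has to use it.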
Could any of you help me improve the above scenario (updating the CSV from all 2000 Perl scripts)? Please correct me if I'm understanding your intention wrongly.
The three basic file handles are STDIN, STDOUT, and STDERR, which represent the standard input, standard output, and standard error devices, respectively.
There are two functions, open() and sysopen(), each with multiple forms, which can be used to open any new or existing file in Perl.
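A minimal sketch of both forms, with an illustrative file name under /tmp: sysopen() takes explicit O_* flags from Fcntl plus a creation mode, while open() uses a simple mode string.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl;   # supplies O_WRONLY, O_CREAT, O_TRUNC for sysopen()

# sysopen(): the lower-level form with explicit flags and a file mode.
sysopen my $out, '/tmp/example.txt', O_WRONLY | O_CREAT | O_TRUNC, 0644
    or die "sysopen failed: $!";
print {$out} "hello\n";
close $out;

# open(): the common high-level form with a mode string ('<' = read).
open my $in, '<', '/tmp/example.txt' or die "open failed: $!";
my $line = <$in>;
close $in;
```

The three-argument form of open(), shown here, is preferred over the older two-argument form because the mode can never be injected through the file name.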
2) Write a script that keeps running all the time, holds all the filehandles open, and, using select(), finds files with new data and dumps it to the output in the order it was received.
This method could become a resource hog, as it would be constantly calling select(), then looking for new files, then opening the new files, then calling select() again. If you ever end up in a situation where the loggers have more log files open than a single process on your operating system can support at a time, you're back to solution number 1.
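A minimal sketch of that long-running coalescer, with illustrative file names. One caveat worth knowing: select() always reports plain files as readable, so for regular log files the loop ends up polling anyway; this sketch just reads whatever has been appended and clears the EOF flag with seek() so later appends are picked up. A real version would also scan for newly created log files and cap the number of open handles.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Copy any newly appended lines from each open log handle to $out.
# Returns the number of lines copied on this pass.
sub drain_logs {
    my ($handles, $out) = @_;
    my $copied = 0;
    for my $path (sort keys %$handles) {
        my $in = $handles->{$path};
        while (defined(my $line = <$in>)) {
            print {$out} $line;
            $copied++;
        }
        seek $in, 0, 1;   # clear EOF so the next pass sees new appends
    }
    return $copied;
}

my %handles;
for my $path ('/tmp/a.log', '/tmp/b.log') {   # illustrative names
    open $handles{$path}, '<', $path or next; # skip files not yet created
}
open my $out, '>>', '/tmp/coalesced.log' or die "open failed: $!";

# The real daemon would loop forever, sleeping when idle:
#   while (1) { sleep 1 unless drain_logs(\%handles, $out); }
drain_logs(\%handles, $out);
```

Because the handles stay open across passes, lines are emitted roughly in the order they arrive; exact ordering across files still needs the timestamps from option 1.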