No such user

One of my first programs was for IT: it builds a string and copies it to the clipboard, so they can paste it into their tickets. Later, a macro automates the report that collects these perfectly typed strings.

IT folks poke at software for fun, and I should have expected it. “BUG,” read an email, detailing a trivial way to mess up the process. That led to a hundred extra lines to force a sequential process: iterating across JButton arrays, calling setEnabled(false), and checking global boolean state.

The first hour went to futzing with ButtonModel and selection calls. The next went to laying the groundwork for a procedural solution. The logic itself took half an hour and zero documentation.
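The gating idea above can be sketched in a few lines of Swing. This is my own minimal reconstruction, not the original code – the class and button names are invented – but it shows the pattern: an array of JButtons, a step counter as shared state, and setEnabled(false) on everything but the active step.

```java
import javax.swing.JButton;

// Hypothetical sketch of the sequential-enforcement idea:
// only the button for the current step stays enabled.
public class StepGate {
    private final JButton[] steps;
    private int current = 0; // shared state the handlers check

    public StepGate(JButton[] steps) {
        this.steps = steps;
        refresh();
    }

    // Called by each button's ActionListener after its work succeeds.
    public void advance() {
        if (current < steps.length - 1) {
            current++;
        }
        refresh();
    }

    private void refresh() {
        // iterate the JButton array, disabling everything but the active step
        for (int i = 0; i < steps.length; i++) {
            steps[i].setEnabled(i == current);
        }
    }

    public static void main(String[] args) {
        JButton[] buttons = { new JButton("Copy"), new JButton("Paste"), new JButton("Send") };
        StepGate gate = new StepGate(buttons);
        System.out.println(buttons[0].isEnabled()); // only "Copy" starts enabled
        gate.advance();
        System.out.println(buttons[1].isEnabled()); // now only "Paste" is enabled
    }
}
```

The point of routing everything through one refresh() is that no handler ever toggles a button directly – which is roughly where those hundred extra lines went.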

Keep a trace log

One big thing I learned from the C# port was the concept of logging: write output to a file; have a common function write_log("something") that creates a plaintext file nearby. The file can be read, mailed, filtered and archived. A lot of times – gdb notwithstanding – I have to reproduce bugs, and a trace log helps me narrow the problem down to the last few lines executed.

proc write_stamped_log {msg} {
  # log file sits next to the script
  set log_path [file join [file dirname [info script]] err.log]
  set fh [open $log_path a]; # append
  puts $fh "[clock format [clock seconds]] $msg"
  close $fh
}

Customer requirements

I rolled up a TCP client-server and put out the alpha. I searched “single-threaded server” and thought about the I/O-blocking horror. If we set up all the machines beforehand, we could avoid that bottleneck. I thought about fast array lookups and cached indexing, logarithmic runtimes and great success.

“You shouldn’t make a form without first finding out what we want it to look like and what we want out of it,” the manager said.

So much for letting loose. We were explorers in a warehouse jungle of girder shelves and cardboard trees. Our connections were spotty; we kept the control file in our electronic bibles and our scanners unsheathed.

I thought about our reporting process. We can’t do realtime, but we can do realtime extrapolation. If I can retrieve the data in 15-minute intervals, process it during the idle time in between, run a linear regression and spit out a .png from R, and then mail it – bam: steaming-fresh, friendly, twice-daily works of art!
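The regression step is just ordinary least squares over the interval samples. A sketch of the math, with made-up sample data – the actual plan fits the line and renders the .png in R, so this is only the arithmetic:

```java
// Least-squares fit over counts sampled every 15 minutes,
// then extrapolated one interval ahead. Sample data is invented.
public class TrendLine {
    // Returns {slope, intercept} fitting y = slope*x + intercept.
    public static double[] fit(double[] x, double[] y) {
        int n = x.length;
        double sx = 0, sy = 0, sxx = 0, sxy = 0;
        for (int i = 0; i < n; i++) {
            sx += x[i]; sy += y[i];
            sxx += x[i] * x[i]; sxy += x[i] * y[i];
        }
        double slope = (n * sxy - sx * sy) / (n * sxx - sx * sx);
        double intercept = (sy - slope * sx) / n;
        return new double[] { slope, intercept };
    }

    public static void main(String[] args) {
        // scans counted in four 15-minute intervals
        double[] t = { 0, 15, 30, 45 };
        double[] scans = { 12, 18, 25, 31 };
        double[] line = fit(t, scans);
        // extrapolate one interval ahead (t = 60 minutes)
        double projected = line[0] * 60 + line[1];
        System.out.printf("slope=%.3f projected=%.1f%n", line[0], projected);
    }
}
```

Feeding the fitted line a timestamp past the last sample is the whole "realtime extrapolation" trick: the picture is fresh even though the data is a quarter-hour stale.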