Buffer-wrapped lib calls

stdio.h has getchar() and putchar(), and I wrap these in functions that use an external buffer: an array tracked by an index. getchar() blocks for input, and the wrapping function – I named it getch() – returns one character at a time from the buffer.

What if getchar() already uses an internal buffer? Haha: then I am buffering a buffered standard library function call. My excuse would have to be modifying the characters fetched from the input stream en route, i.e. conforming to a Fortran-type fixed-length format, where each row has padded blanks at the end.
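A minimal sketch of what I mean, assuming a line-at-a-time refill – buf, bufi, buflen, BUFSIZE, and this getch() signature are my own choices for illustration:

```c
#include <stdio.h>

#define BUFSIZE 100

static char buf[BUFSIZE];   /* the external buffer */
static int  bufi = 0;       /* index of the next character to hand out */
static int  buflen = 0;     /* number of characters currently buffered */

/* Return one character at a time from the buffer, refilling it
   (and blocking in getchar()) only when the buffer runs dry. */
int getch(void)
{
    if (bufi >= buflen) {
        int ch;
        buflen = 0;
        while (buflen < BUFSIZE && (ch = getchar()) != EOF) {
            buf[buflen++] = (char)ch;
            if (ch == '\n')     /* refill one line at a time */
                break;
        }
        bufi = 0;
        if (buflen == 0)
            return EOF;         /* input is exhausted */
    }
    return (unsigned char)buf[bufi++];
}
```

The refill loop is also where characters could be modified en route – e.g. padding each line with blanks out to a fixed width.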

Hopefully these little tribulations pay off later on, when the design “pattern” of buffers and blocked waiting gets applied to video encoding, multiplexing, and threaded applications. I’ll look back on the humble getchar()/putchar() and realize that I spend more time thinking than writing code. And that’s okay.

putchar(0)

I went back to the first chapter in “Software Tools” and looked at getc() and putc() again. The Fortran sample code assumes fixed spacing, so every line is padded with blanks. This is non-obvious going into Chapter 2. I had wondered why we needed a program like entab to replace blanks with tabs and a program (detab) to do the reverse, replacing tabs with blanks.

If we are printing output to a line printer, the extra blanks cause the line printer to stutter, locking the carriage into a forward-pause-forward motion. It would be slower than traversing tabs (\t), where the printer head could advance smoothly. (Actually, I have never used a line printer, but I assume this.)
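The arithmetic at the heart of both programs is computing the next tab stop from the current column. A sketch, assuming fixed stops every 8 columns – the book treats the tab width as a parameter, and TABWIDTH and next_tab_stop are my names:

```c
#define TABWIDTH 8  /* assumed fixed spacing; the book parameterizes it */

/* Next tab stop strictly after column col (columns numbered from 0). */
int next_tab_stop(int col)
{
    return (col / TABWIDTH + 1) * TABWIDTH;
}
```

detab replaces each tab with blanks until the column reaches next_tab_stop(col); entab does the reverse, emitting a tab whenever a run of blanks reaches a stop.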

One of the last exercises in Chapter 1 is to modify the included getc() and putc() code to handle “variable-length” strings. With the benefit of hindsight, I know that in K&R C strings are NUL-terminated: “hello” is really the six characters h, e, l, l, o, '\0'. This simplifies putc() because we don’t have to track an index or insert blank padding before output. (To simplify means less code.)

The tricky part is the string buffer: putchar() is called on the NUL character, i.e. zero (0). Is this okay? Visually, nothing gets printed. I do not know enough to know better, so I will keep it for now.
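Both points in a sketch – the output loop needs no index or padding, and the terminating NUL can go through putchar() safely (putstr is my name, not the book's):

```c
#include <stdio.h>

/* Print a NUL-terminated string: no index, no blank padding. */
void putstr(const char *s)
{
    while (*s)              /* the '\0' terminator stops the loop */
        putchar(*s++);
    putchar(0);             /* writing the NUL byte itself is legal;
                               it just produces no visible output */
}
```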

I am also using a character buffer, not the typical integer buffer. As far as possible, I am trying to stick to the sample code’s intent. That led me to a philosophical conflict in handling EOF, which is outside the range of any normal character: EOF is -1, while valid characters are all nonnegative. I cannot faithfully store a -1 in an array whose elements can only range from 0 to 255.

This is noted in the putc() and getc() examples by the parameter: a character is passed in. For the C port, I use a pointer. So the function returns an int to signal EOF or a character’s value (0 to 255), but the real character is assigned like this:

int getch(char* c)
{
  ...
  if (eoff == EOF) {          /* eoff: result of the last buffered read */
    *c = 0;                   /* non-printing character */
    return EOF;
  } else {
    *c = buf[lastc];
    return (unsigned char)*c; /* cast keeps 0-255 distinct from EOF (-1) */
  }
}

char c = EOF

In “Software Tools,” the Fortran implementation of getc(c) assigns the next character in the input stream to the parameter c and returns an integer value that may or may not be EOF. It’s fine until I get to this statement:

...
c = EOF
...

Printing that out with putchar() in a small C program results in this letter: ÿ
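A sketch of why, under the usual assumptions that EOF is -1 and char is 8 bits: the assignment keeps only the low byte, 0xFF, which is ÿ in Latin-1.

```c
#include <stdio.h>

/* Demonstrate what putchar() receives after EOF is squeezed into a char. */
void print_eof_as_char(void)
{
    char c = EOF;       /* -1 truncated to the single byte 0xFF */
    putchar(c);         /* putchar() takes the value as an unsigned char:
                           byte 0xFF, which renders as 'ÿ' in Latin-1 */
}
```

The trap runs the other way, too: with a signed char, (c == EOF) is true here, so a genuine 0xFF byte in the input would be indistinguishable from end of file – which is exactly why getchar() returns an int.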

No! Okay, now we go to “Elements of Programming Style” and look for the READCH() subroutine:

...
C END OF FILE
  90 READCH = NO
     RETURN
     END

Absent errata, I will ignore the assignment of the EOF (-1) constant to an (unsigned) char, which only accepts nonnegative values from 0 to 255. The sticking point was my decision to use character arrays rather than integer arrays to represent strings.

Electronic epiphany in the birthwhile saga

Teaching people how to use a computer is a noble effort, but the only time any learning takes place is during clutch sessions. The user only learns what is minimally needed to accomplish a specific task and will not expend any effort to experiment. Beyond the goal, the computer is only a means to it, a utility. Only with enough tasks accomplished does an interaction pattern emerge, and the user is comfortable navigating the filesystem, sending emails with attachments, and manipulating files.

These people drive sales. They are ignored at your peril. Yet how do you turn a profit on individual lessons? Each one divides your time across the same concepts. That leaves few options, but there is one: automation.

Desktop-level manipulations, unlike videos, put the user in the here-and-now. In fact, it’s even a little dangerous: the wrong window handle, the misplaced file selection, and poof – something is deleted or worse. Hey, wait – that’s experimental! If the user won’t play, bring the game to him.

Legos and instructions

Reading the manual pages for sh, I thought to write a small script for each concept. It was difficult in a chicken-and-egg sort of way: how do you use if statements “before” you learn them? Maybe jumping around is a better approach, because at least you’re context-driven.

The man pages are organized into sections based on a reference layout; a project-oriented focus would prompt a different perusal strategy. And is writing little throwaway scripts the best way to retain knowledge? It also depends on experience, the sheer amount of typed stuff that grows in the folder.

Composability and emergence clash here: I can’t “make up” situations in which the concept validates itself. There needs to be some upper-layer driving force, a problem that lends itself to reducibility. Software is all about deconstruction, about mini-solutions. But I expect code to represent conceptual retention; can I have my cake and eat it too?

No, I don’t think so – at least, to entertain that line of thought: suppose there must exist a notion of a design to solve some problem, and data serves the design. If your program serves the data – the data structure(s) – then it’s progress. You iterate on the design; you iterate on the software; you push and commit.

I learned batch scripting to use cURL. I learned AutoIt to expand beyond VBA. I learned Tcl/Tk to bridge platforms. I want to learn sh to manage co-processes.

Unless my goal is to definitively annotate the sh(1) man page, the better approach is to try out <& and gradually add concepts around it.

Dinosaur in the dollhouse

I like Visual Studio a lot. Terminal is a “6px font,” which in VS 2008 is sharp and small. It’s legible and lets me see a ton of code at a glance. Breakpoint debugging is pretty much my bread and butter.

A decade ago, I was indoctrinated by free software. The justifications behind proprietary solutions were a mystery to me; in some ways, they remain so. All these concepts, given function to be of use to others, layered behind monetization and marketing. And those same people, using resources available online and contributed freely for their own gain, would seem like roosting vultures.

Maybe that same vague, intangible faith in free software – or even the willingness to remain befuddled by the motivations of market forces – has kept me locked in a certain mindset, keeping me from grasping greater layered abstractions. For me, everything ought to be small and composed, just so, but in reality large, complex systems are necessarily reinvented out of haste rather than craft. (Or: I do not believe any meaningful project could come out of exercising others’ APIs.)

It is one thing to claim humility by crediting the giants who labored before you if you are the inventor of the calculus, I guess, and another to sit and think, “How can I properly contribute to all the amazing work that’s already been done, in a manner that produces a useful output, without feeling like I must focus on fundamentals?” Because I would very much like to embrace Java and C# and Windows internals and learning from random StackOverflow questions.

Somehow, I’ve convinced myself that the only path to C is through old books, that UNIX is the last stop of our computing architecture, and nothing is more important than the commandline. This atop a society progressing toward mobile UIs, portable security and ubiquitous networks.

As a youth, I should be rebelling against the old. Intuition demands otherwise. Sometimes I think it was easier when I only cared about games, novels and myths: it was pure oblivion and not so torturous. Without them, I crash headlong into a lifetime of technical upkeep and constant (corporate) validation.

I wonder if I would have taken up programming at all if I had known love of it wasn’t enough.

Wifi with athn0

This is how OpenBSD 5.5 configures the wireless for athn0, via /etc/hostname.athn0:

nwid ROUTER
wpakey '$@fF3'
dhcp

Simple, no? The hardest part was single quotes around the WPA key. Actually, there was one other thing: the firmware needed to be installed, so an Ethernet connection had to be set up. In retrospect, the following were futile:

  • sh /etc/netstart followed by prayer
  • ifconfig followed by concentration of latent psychic energy to force status to *active*
  • dhclient with progressively louder squeaks and whimpers

This reminded me that I had a long way to go to enlightenment. Ordered thinking and documented attempts would have helped me in the long run.

Government of McAfee

Here are some workarounds for debugging in Visual Studio while McAfee is active:

  • Run as Administrator: set it for your bin\Debug\*.exe file (Compatibility tab)
  • Run as Administrator when launching Visual Studio: you will see “(Administrator)” in the title bar.
  • Pick a port other than 21 for localhost FTP testing
  • The AutoIt TCP server didn’t get flagged; I feel lucky for that. Maybe it was set to Run as Administrator too.

It was not the doom and gloom that I was expecting, although I did lose a couple good days setting things up. Hardware simulation is groovy, but deadlines dictate constraints.

Check if Windows Vista, 7, 8 or 8.1 is activated

You can exec this as an external proc from C# too:

slmgr /xpr

You’ll have to make sure your IDE and/or binary is set to Run as Administrator. Windows is pretty strict about that now.

For Windows XP, it’s “oobe/msoobe /a” or something similar. I have not tested it.

Even with two activated licenses, I still get the activation notice. Bleh.

Repair after plaintext

If anything, KPIs are useful for keeping people in line with churning out units; human nature stays apathetic, and even open troves are unattractive. Even so, it was suspicious to run tests on public machines. First, do no harm.

I went and bought a similar machine. Management didn’t like it, but neither of us would compromise on the customer. Even if I’m not reimbursed, at least it has a nice keyboard. Plus, I can finally walk around and work anywhere, a true technical romance.

I’m close to finishing the port. Finally, I can get back to the other good stuff to learn. Momentum dictates that I crack open a book on Windows internals or run through Pelles C with Petzold.