Workstation emigrants

The way to beat Excel is to create superior forms of visualization and interaction with data. Everything is tables, but summary reports are only the output: don’t we control the pieces that come before them?

PivotTables are great because you can summon up workbooks from each data cell. It’s an embedded view. The equivalent with <td> tags and hyperlinks just doesn’t seem to cut it, unless there are some rounded rectangles and faux gradients, shiny buttons.

If there were a way to use R scripts as component replacements for Excel operations – I could start there. A clicked link is a REST GET, just like a double-click in the spreadsheet: you would get a different perspective (slice) of information.
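A minimal sketch of that idea, in Python rather than R: the clicked link becomes a query over a table, and the handler returns a different slice of the data. The table, column names, and helper are hypothetical stand-ins; only the link-to-slice idea is from the text.

```python
# Sketch: what a handler behind a link like GET /sales?region=West might do.
# A double-click in a spreadsheet cell becomes a hyperlink carrying filters.

SALES = [
    {"region": "West", "product": "widget", "units": 40},
    {"region": "East", "product": "widget", "units": 25},
    {"region": "West", "product": "gadget", "units": 10},
]

def slice_view(rows, **filters):
    """Return the subset of rows matching every query-string filter."""
    return [r for r in rows if all(r.get(k) == v for k, v in filters.items())]

# The clicked link /sales?region=West resolves to this perspective:
west = slice_view(SALES, region="West")
print(west)
```

An R script behind a REST endpoint would play the same role: parse the query string, filter, and render the slice.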

I want to work on free software at work. Anything that gets me a step closer to that ideal is an acceptable compromise. Even if it’s harder. Especially if it’s that.


Elder programs and RAM tenants

Why data structures – and algorithms – if we can relegate everything to disk I/O? If performance isn’t a concern and hard drives are cheap, as with automation, we could get away with it. The operating system would schedule our tasks, allocate the memory, and clean up afterward. Years and years of development time have gone into making processes Just Work; why mess with a good thing? (Embrace the command-line; embrace the shell.)

We go back to our original assumptions: performance matters, and storage isn’t cheap. Supposing this, why would we want to “wear RAM out” by keeping programs long-lived, sipping on current and squatting in turns? That more expensive memory doesn’t have moving parts, for one. For another, a crashing program might necessitate a reboot, but a crashed disk means mechanical replacement, lost data, and so on.

Our professors could have said, “You are learning about linked lists, stacks, trees and queues because you will write artifacts to eternity. Such will be your authorship that these processes will outlast workdays and weekends. Scheduled tasks are as the brief flame of a candle; you will write slow-burning stars, fusion suns.”

That leans more toward systems programming. If any process must fail, let it be one in user-space; the kernel and the daemons need to last as long as power drives the fans, please.

What kinds of user-based programs would benefit from long-running processes? Anything having to do with real-time, although those would seem to stick very close to the system, bare-metal layer. The criterion would be a program that exercises the allocation of dynamic memory, creating and deleting as it is wont. (That still sounds like server-side to me.)

Spreadsheets are one example of a user-side, long-running process. You could have Excel open all day. A few people could be attacking a shared workbook at once. We really take copy-paste for granted. None of us bat an eye at transferring the goods: if it looks like a table, we trust Excel to understand it. Rows are inserted, cells are deleted, macros are executed.

Games are another example. You don’t want memory leaks forcing inopportune reboots. Gamers are likely to keep the thing running while their GPUs whine, but that’s the whole point of having enthusiast hardware. Even if most communication is client-server, the game needs to be receptive to controller inputs. It needs to manage a relentless game loop and maintain responsiveness.

How does this all relate to treating people as decision-making interpreters of a business process “language?” What is the best way for them to remain alert and receptive to status changes and aberrant unpredictables? If they are the gatekeepers of data transforms, I imagine stuffing everything into a database and rigging up facades. Is that the easy way out?

With enough statistical interaction, can a program learn a process? Is it all just gun-kata?

Programs as objects in MVC

I used a batch script to return error codes to an AutoIt program. RunWait() returns the exit code. Essentially, the GUI is the view and the script is the controller. Instead of one monolithic program with object instances, each program is an object. I could communicate through single-digit numbers. It would be a short step to have a C program manage an SQLite model; the MVC trifecta would be complete.
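The same pattern in Python, for illustration: the controller launches a worker process and reads its exit code, just as AutoIt’s RunWait() would. The worker here is a stand-in one-liner and the code vocabulary is hypothetical; a batch script returning errorlevels works the same way.

```python
import subprocess
import sys

# "Each program is an object": the controller runs a worker process and the
# exit code is the message. Here the worker is a throwaway Python one-liner
# that simply exits with code 3.
worker = [sys.executable, "-c", "import sys; sys.exit(3)"]
result = subprocess.run(worker)

# Single-digit exit codes become the shared vocabulary between the pieces.
MESSAGES = {0: "ok", 1: "retry", 2: "bad input", 3: "model out of date"}
print(MESSAGES.get(result.returncode, "unknown"))
```

Swap the one-liner for a batch file or an AutoIt script and the controller does not change; that is the appeal of treating whole programs as objects.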

Do the same object rules apply? Like tight coupling: how intimately should the GUI know about changes in error codes for the controller? And likewise the model: how closely should each program hew to the domain? Should the GUI handle each error with aplomb?

I guess there could be an “error interface,” which every program-object knows about, for which it may have different behaviors according to its context. I could hide a lot of elaborate hoops in the model for error handling, while the GUI could just be the sweet, silent type.
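One way that “error interface” could look, sketched in Python with hypothetical codes and contexts: every program-object understands the same codes, but each reacts in its own way.

```python
# A shared "error interface": the same code means different behavior per
# context. Contexts, codes, and actions are illustrative, not a real system.
ERROR_BEHAVIOR = {
    "model":      {1: "roll back transaction", 2: "rebuild index"},
    "controller": {1: "rerun the script",      2: "escalate to a human"},
    "view":       {1: "show a gentle message", 2: "show a gentle message"},
}

def handle(context, code):
    """Each program-object reacts to the shared code in its own way."""
    return ERROR_BEHAVIOR[context].get(code, "log and stay quiet")

print(handle("model", 1))   # the model hides the elaborate hoops
print(handle("view", 2))    # the GUI stays the sweet, silent type
```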

We’ll click mice, but make it worthwhile

A day’s work could be defined as a collection consisting of

  • delegating tasks down the line
  • reconciling discrepancies in rules (caught automatically by our magic system, flagged thereby and forwarded upstream for decision redirection)
  • stepping forward and backward through a process graph of actions for auditing by manager request
  • putting out fires/damage control
  • attending birthdays and baby showers
  • coordinating responsibilities among horizontal co-workers

I would want the application(s) to minimize “blocking I/O” circumstances: anything that prevents the “go forward” of a person’s decision-making. Basically, anything that must be manually checked, manually verified or verbally confirmed should be automated to the point where only decisions remain.

People make decisions; programs get them there. GUIs are the visual representation of piped streams between invocations of an admin’s harried fingers. They are the daily changing spec, the waterfall we can’t escape: everything that keeps us from a robotic future and true leisure.

Excel, the killer app

Excel is awesome. You can’t beat it. You can’t rewrite it. It runs on Windows. It can be scripted. It can be used to create forms, labels and perfectly aligned documents. It’s fast. Which of these can we use as an attack vector?

If it’s true that a program cannot be rewritten better unless the new author is smarter than the original one, then I don’t have a chance. Thanks to iteration and free software, this doesn’t have to be the case: even more brilliant people could work on my project.

Replacing Excel is not the answer. How do we work with Excel while providing concurrency, verification and real-time feedback?

A helpful program could be a view into a table with enough columns to describe the problem-decision context. The back-end database would pull the raw data from target sources. A contextual database. Databases aren’t known to be easily transformable, right?

Maybe there needs to be a language definition: synonyms, even. A column in one table from a foreign database is named A, and we designate that our column B in the dynamic database is equivalent. Whatever target column needed changing would be an update to our definitions rather than a change in our schema. Our database would pull information daily.
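A minimal sketch of that synonym layer, with hypothetical column names: the foreign name maps to our stable one, so a renamed source column becomes an update to the definitions rather than a change to our schema.

```python
# The "synonym" definitions: foreign column name -> our canonical name.
# When the source renames a column, we edit this dict, not our schema.
SYNONYMS = {"CUST_NO": "customer_id", "AMT": "amount"}

def translate(foreign_row):
    """Rename foreign columns to our canonical ones; pass unknowns through."""
    return {SYNONYMS.get(k, k): v for k, v in foreign_row.items()}

row = translate({"CUST_NO": 1042, "AMT": 99.5})
print(row)  # {'customer_id': 1042, 'amount': 99.5}
```

The daily pull would run every incoming row through translate() before it lands in the contextual database.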

There would need to be a history of definitions, an etymology. And “linguistic” translations that know today’s column C used to be column B, which connected to column A – and can serve up the data for both tables. Hopefully this doesn’t happen too often; instead, we define new columns, new synonyms (2nd-generation immigrant type versus legacy), and manipulate the controller to hide the old columns.

The retrieval could bog down after a while, pulling unimportant legacy columns along with all the new ones, but we could trim our requests when the dynamic database regenerates itself.

Logic verification

Programs only ever work with true or false. Well, if I want to maintain sanity – yes – programs only ever work with true or false. Can we write programs which verify themselves? At least by comparing their states to a set of rule statements.
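A toy version of that self-verification, with a hypothetical state and rules: each rule statement is a predicate over the program’s state, and verification is simply “all of them hold.”

```python
# A program checking its own state against rule statements. Each rule is a
# named predicate over the state; verification is "every rule returns True".
state = {"balance": 120, "pending": 3, "closed": False}

RULES = [
    ("balance is never negative", lambda s: s["balance"] >= 0),
    ("closed accounts have no pending items",
     lambda s: not s["closed"] or s["pending"] == 0),
]

violations = [name for name, rule in RULES if not rule(state)]
print("verified" if not violations else violations)
```

Run after every state transition, this keeps the program honest against its rule statements rather than against wishful thinking.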

The interface disconnect is that no matter how much the database records confirmation of a receipt, the actual tangible object is flimsy and lightweight: prone to flights behind the printer, or into the nearby trash bin, or lost among cartons of new boxes. The eventual turn is that people blame each other for a broken process.

It would be nice to have a “repository” for process actions. A source control for business workflow. Managers would get a nice graph showing every decision point. They could click and see the specific data that was floating around at that moment, a virtual snapshot of real files. They could also see annotations where the process broke down; i.e., where reality (wind) messed with the system (paper flying around and getting lost).

We’re trying to model a real process, and reality is about the leakiest abstraction there is. Instead of coping with hard-coded processes, there should be a way to also handle failed processes. A divergent graph where process actions are pushed into the repository reflecting an improvement (moved the trash bin away), and then joining back in with the trunk.
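A crude sketch of that repository, with hypothetical action names: an append-only log of process actions, a divergent branch that records the breakdown and the improvement, then a merge back into the trunk.

```python
# "Source control for workflow": an append-only log of (action, note) pairs.
trunk = [("receive", "receipt #881"), ("print", "receipt #881")]

# Reality intervened: the paper blew into the trash. Record the breakdown
# and branch off with the improvement instead of hiding the failure.
branch = trunk + [("annotate", "receipt lost near trash bin"),
                  ("improve", "moved the trash bin away")]

# Join the branch back into the trunk once the fix is accepted.
trunk = branch + [("merge", "improvement adopted")]
print(len(trunk), "actions on record")
```

A manager’s graph view would just walk this log; every decision point and every annotation is already there.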

Or even better: the workers themselves, these end-users, can insert those improvements. They are self-programmable, goal-seeking, autonomous agents. We should capitalize on that.

Paean to information

Suppose we have a set of programs P which completely encapsulates the state of the business processes of a company C. What would that mean?

  • End-users control decisions that are logged and audited for quality control.
  • End-users are empowered to act within the process framework.
  • End-users are not enslaved by “blocking system I/O”: it’s always a “person endpoint.”
  • The system is able to articulate actions as natural extensions of the business process in a logical format.
  • The system may create inferences that could have reflected the end-user’s mindset at the moment of a decision (the end-user is defensible).

Again with Bret Victor: he mentioned something about how Dr. Shannon didn’t give a flip – or wouldn’t have – about the sexiest tech trends this side of the iPhone (or JavaScript), but that people would always value the exchange and “play” of information: its collection, its operations, its implications.

Or maybe it doesn’t start with programs, but a set of logical axioms: a bunch of variables, and things which must be true of certain cases (constraints), and decisions which are either critical or can be optimized by an individual’s “order of operation.” (As much as possible, I would want to avoid “meta-programming” or whatever scariness would occur if we abstract too far.)
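A toy illustration of that axiomatic start, with hypothetical variables and rules: constraints are predicates over an assignment, and adding a new “idea” can contradict what came before.

```python
# Variables plus constraints, no meta-programming: each constraint is a
# predicate over the assignment. Everything here is an illustrative toy.
constraints = [lambda v: v["headcount"] >= 1,
               lambda v: v["budget"] >= 100 * v["headcount"]]

def satisfiable(vars_, rules):
    """True when every constraint holds for this assignment."""
    return all(rule(vars_) for rule in rules)

assignment = {"headcount": 3, "budget": 500}
print(satisfiable(assignment, constraints))   # the axioms hold so far

# The next "idea" contradicts something from before:
constraints.append(lambda v: v["budget"] < 200)
print(satisfiable(assignment, constraints))   # the framework now conflicts
```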

The real challenge is adoption. How do we get management to accept novel products (programs) in the middle of their established workflow? I can already see people shrugging: “if it comes in my inbox on time, why should I care more for it?” Well, then these programs must operate “in medias res,” in the middle of things, without interruption, while increasing user productivity.

How do we first create a world? We begin with letters, words and sentences. They build on each other until you must pause and consider: the next idea you write contradicts something from before. You’ve just built an axiomatic framework. A language that can compile itself. A bedrock of assumed truths.

States and models

A program should be able to express, during its execution as a process, the various states of some model of rules, verifying in turn the workflow of the user with whom it must daily interact.

Automation is not the arena for this. Automation is scheduled tasks. It’s raw files delivered fresh, while everyone’s asleep. It’s the roasted pork that’s pulled and served by end-user macros, collected with style-gravy and resting in slow-seeping juices on a white plate with the ding of the order-up bell.

UNIX is about programs piping programs, I/O the immigrant and emigrant, shuttled data between transition states written on the magnetic platters of a metal continent, bused around by controllers and delivering single-threaded cleanliness to sane musings. I love it.

Then we come to GUIs. What the heck are GUIs? Suppose the tasks of office folks – like those of an admin – aren’t meant for scripts. Suppose their job is not to create scripts, but to act as human REPLs for business processes. Yes – interpreters of a vast scheme whose hierarchy stretches to people I may never meet, unless I gird myself in fanciful accomplishments re: machine learning, linear optimization, etc.

Then it would make sense to have a GUI: it is the view of data as it passes between states. Those states define the business process. They cannot be automated insofar as they require human decision makers. The conclusion becomes thus: people are programs, but slotted in at critical branches, who possess the remarkable properties of adaptation and learning.

My goal seems straightforward: take away everything that does not require their input, and give them enough levers to exercise judgment and sane audits. Treat them as programmers: they are iterations of a business process; empower them to improve it.

The trick is to create tools – GUIs! – which enable them to adjust and adapt to situations that sometimes aren’t their fault and aren’t in their control: they must make the best of the circumstances. But as human beings, the system should be a slave to them, not the other way around.

The girl I hope to work with

The food truck is the cafeteria; my car is the hot nest: I recently promoted myself to the tiny microwave room with the disposal, where a furtive fifteen could be spent over mouthful-labor, shoveling rice and beans and seasoned beef into a gluttonous gullet, justified in post-work cycling and daylight reward – if I could finish some focused task by the third break, I could go the three-fifty of a pretend engineer’s salary and mingle with Google colleagues.

I loathe it for all of the ideology which I perused in impressionable days of high school or college or some foggy moment where life was a blur of slights and underachievement, but I think I’m going to learn the Win32 API in a sort of deep way. Because I mean if you want to be the next Bret Victor and your dad already scouted electrons before you were born, the logical step is how the hell do you reach the nirvana of GUIs that are evented creatures conjured from custom pointers, and not merely another set of buttons to balance a list view?

Between MFC and K&R C is a Windows that takes compiles and capitals, where pointers are passed or cast, where promise proposes design in integer encodes and a litany of constants. I don’t think it’s much different, in the context of event-driven programming – at the level of C – versus Xlib or Cocoa: there will be callbacks, abstracted typedefs, and buckets of #defines. Right?

The crawlspace of the Internet

Every encountered problem in the narrowed mind had been mentioned before: the Internet was the scattered library, a construct of limited custodians and a panoply of one-off solves. All that was needed was a quick eye and enough tabs to feed the lengths of Google searches. Keywords were keys; forums the gardens where thoughts came to roost.

He figured there existed a critical threshold. Where search ended, novel work began. Until then, everything was a matter of language and fit, features to plug as specs asked, and no more was pushed. Meanwhile, home became the humble hovel, where machine code was the reminder of his mortal mien.

A year had passed, and he could count the number of instantiated arrays by a single-digit hex. It was data gleaned and data cleaned, I/O loops scheduled to outlast turnover.

The one bright light was that he was always learning something, and in the current circumstances it was a blessing, even if it was the same wheels through the same mucked trails.