Additional information...

Projects


Other project lessons (in chronological order):

After-school musings on a dead-dog vector-based line drawing project

One of life's lessons taught to me often by respected elders is encapsulated by the phrase finish what you start. Sure, I'd been studying French since the sixth grade (when a crush on a mocha-colored teacher from Haiti with a glorious accent got my attention) and was doing very well in absorbing pages and pages of foreign words to be communicated in meaningful combinations. But, my high school had started a computer science course upon receiving two shiny new Apple IIe computers. How dare they schedule the first course in the computer science sequence during the same academic period as my French series. Another crush on a high school teacher and trips frolicking with cousins in Quebec kept me true to my elders' advice. Luckily, I performed well in my trigonometry course, as the teacher doubled as the school computer science teacher (she actually had a computer science degree from USC - southern cal, that is). So, she gave me the keys to the classroom, suggesting I stay after school and learn to program BASIC on my own. I was quickly drawn to vector-based line drawing (a precursor to today's popular pixel-jockeying fad - line riding, perhaps). I developed a friendship with an aspiring artist (hey Tim, do you remember how we got hooked up on this?) and we hacked away at my first application. He drew out sequential scenes in a dead-dog adventure on graph paper, and I broke the drawings into colored line segments that I coded into the lines of BASIC I was getting accustomed to. Basically, as a user, you experienced driving a car long enough to run over a dog in the street. Then, you had to choose what to do next. Would you drive on? Would you stop and put the dog on the side of the road? Would you try to find the owner? Depending on which steps you took, you got the next scene from a random list of scenes appropriate to your choice. Always the potential for a happy ending. Always the potential for conflict with society or a mad dog owner.
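
For the curious, the whole mechanism boils down to something like the sketch below, written in modern Python rather than the Apple IIe BASIC of the day; the scene names, coordinates, and choices are invented stand-ins for Tim's graph-paper artwork:

    import random

    # A scene is just a list of colored line segments: (x1, y1, x2, y2, color).
    # These coordinates are illustrative placeholders, not the original artwork.
    SCENES = {
        "dog_in_road": [(10, 40, 60, 40, "white"), (30, 35, 38, 35, "brown")],
        "drive_on":    [(10, 40, 60, 40, "white"), (55, 35, 63, 35, "brown")],
        "dog_moved":   [(10, 40, 60, 40, "white"), (5, 45, 13, 45, "brown")],
        "find_owner":  [(10, 40, 60, 40, "white"), (70, 20, 78, 20, "brown")],
    }

    # Each choice maps to a pool of follow-up scenes; one is picked at random,
    # just as the original drew the next scene from a list appropriate to the choice.
    NEXT_SCENES = {
        "drive on":   ["drive_on"],
        "move dog":   ["dog_moved"],
        "find owner": ["find_owner"],
    }

    def draw(scene_name):
        """Stand-in for the BASIC line-drawing calls of the era."""
        for x1, y1, x2, y2, color in SCENES[scene_name]:
            print(f"line ({x1},{y1})->({x2},{y2}) in {color}")

    draw("dog_in_road")
    choice = "move dog"  # in the original, the player typed a choice here
    draw(random.choice(NEXT_SCENES[choice]))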

The main surprise was how much coding I could learn behind the scenes of what a user would actually see on the screen. I could play with all kinds of mathematical equations to incorporate interesting effects into how the visuals played out. I added animation to car wheels and car movement (I believe Tim helped refine that greatly). And still, the user experience was all about the eye candy. Nothing was as attention-getting as the humor portrayed in every scene by simple, added artistic lines and animation. The lesson was how important the graphic artist is to the whole computer experience. Even if it is as simple as the design of a pretty icon placed on an attractive toolbar, the user experience is only as good as what the visual cortex receives (later in life, I would read about the importance of sound in this consideration). Enter the screen saver - that simple piece of software that sparked fierce competitions for use (or, as a metric, screens deployed upon). I heard Tim postponed college to make a grand a night as a singer for a popular punk band. My French degraded as soon as I stopped taking courses two years later, but it sure did help me learn Spanish in a quarter of the time.

The world of punch tape and a simple program of chance

Soon enough, my beloved high school peers began bringing in Apple IIe programs to run on the two computers I'd been training on. Demand for the boxes rose geometrically as each new game entered the mix. I was lucky if I could have the room to myself even after three hours of tennis practice. Well, actually, I'd make deals to let groups of people stay with me as long as they used just one of the two IIes. I was always willing to watch them (my first hint I'd be interested in machine usability when part of an interesting process) when frustrated with my own programming progress. Being the soon-to-be-called geeks of our graduating class, these guys didn't much care about the authenticity of the events being crudely displayed on a 1980-era computer. But, give me a break. A football game without the ability to kick a field goal? You could choose to score three points automatically if you were within the twenty-yard line. How goofy was that?

I wanted the ability to develop the field goal process offline so that the game could continue without me needing to change the single-file executable frozen on the 5.25-inch floppy disk. I chose to write a field goal kicking module on the Hewlett-Packard mainframe churning away in the far corner of the room (I never did ask much about how the machine had been used prior to the Apple IIe availability). Programming on that huge box was noisy, as electricity seemed to surge through the components and make relevant surging sounds. And, once done, the pounding of the small hammer that made holes in the punch tape was no fun whatsoever to listen to. Still, I found myself adding more and more realistic variables into the field goal experience: wind direction and speed, field turf and condition, snap accuracy, hold accuracy, and kick accuracy. Choosing to kick a field goal at an appropriate time in the game was quite a study in situation by the time I was satisfied with the experience. But, integrating the module required some silly behavior: a player, upon reaching the mid-field line (center line for you CFL fans - fifty-yard line for most of you) could start to consider whether to kick a field goal or not. If you sat outside the twenty and tried the field goal, you'd get a free walk to the end zone if you made the kick (gaining six points but with the knowledge they were really only worth three). Or, if you missed, you'd fumble graciously to hand over the ball. If you were inside the twenty, you'd get the field goal if you made the kick or you'd fumble (a little less graciously) to hand over the ball if you missed. The system worked, and we incorporated the field goal module into our more serious matches. My surprise was how satisfying the opportunity to use an archaic programming machine still feels to this day (hey kids - I used to store my programs on paper tape and feed the tape back in to make the program active again). My lesson was that modular programming was an inevitability and yet still a black art that the whole effectiveness of the Web depends upon.
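
If you're wondering how all those variables fit together, the spirit of the module survives in a sketch like the one below; the weights and probabilities are invented for illustration, since the original punch tape is long gone:

    import random

    def field_goal_is_good(distance_yards, wind_speed, wind_at_your_back,
                           turf_factor, snap_ok, hold_ok, kicker_skill):
        """Combine the variables named above into one success probability.
        Every weight here is a made-up stand-in for the original tuning."""
        p = kicker_skill - 0.01 * max(0, distance_yards - 20)       # distance penalty
        p += 0.005 * wind_speed * (1 if wind_at_your_back else -1)  # wind helps or hurts
        p *= turf_factor               # e.g., 1.0 for dry turf, 0.85 for mud
        if not snap_ok or not hold_ok:
            p *= 0.3                   # a botched snap or hold ruins most attempts
        return random.random() < max(0.0, min(p, 0.99))

    # One simulated 35-yard attempt with a 10 mph tailwind on dry turf.
    made = field_goal_is_good(35, 10, True, 1.0,
                              snap_ok=random.random() < 0.95,
                              hold_ok=random.random() < 0.95,
                              kicker_skill=0.8)
    print("good!" if made else "no good - fumble, ball turned over")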

Looking inside with a blood-typing assembler program

Of course, as a senior in high school, it was not cool to hang out with the junior-year students after school in the computer lab (which had not grown in assets despite the popularity of software being brought in competitively). Besides, Advanced Placement credits meant lots of respect from your senior-class friends (so, hitting the books was slightly more popular) while cutting down on the cost of college. And, spending three years of futility with the same tennis team friends begged for a little more commitment in our last-chance season (that is a very sad story to be told elsewhere). I turned to the city community college to keep the programming experience alive and kicking. I forced my best friend to accompany me. As a result, I got to program two class projects for the price of one (thanks Dad for paying). Building upon the enthusiasm of trying out the paper tape technology that made expensive core memory less relevant, I decided to degrade to punch card technology, where each statement in a program was stored on its own archaic “IBM punch card” whose 80 columns of 12 rows of punched holes could store a string of up to 80 characters (without the lines, but with little numbers to reflect columns across the card from left to right). Everyone had heard of the dreadful shuffle event of humility - when you dropped your cards and got your commands out of sequence.

I would like to brag that we only used zeros and ones in that course (and as Dilbert says, some of the zeros didn't even work at times), but instead we used a very effective and concise instruction set with which to create programs out of physical machine instructions (move the data in register x to register y). I enjoyed doing that (but nowhere near as much as when I received a much deeper understanding of what I was actually doing in grad school) and found myself competent enough to let myself think I was going to write the program of programs in the course of thirteen weeks' time. Being competitive with my best friend, I assigned him a simple blood-typing classification and recording database application. Sure, I would do the work for him, but it had to be a less impressive product than my own. I diligently spent equal time on the two projects, amazed often by how clearly I could see instructions actually doing stuff in the computer in my mind's eye. I got overwhelmed by the complexity of my program of programs. I never got to a place worthy of submittal. Having done well on the tests, I received a B in the course. Having done a nice, clean implementation of a blood-typing program, John received an A. The surprise was how motivating it was to program when you could actually feel the computer doing the work for you (mind the paper cuts). The lesson was iterate, iterate, iterate. I could have written a program of usefulness worthy of submittal and then added bells and whistles one by one (removing those that did not jibe well with future editions). I am not talking software engineering here (though that is a pretty useful perspective as well). I am just talking a productive approach to rapid prototyping with extreme programming techniques (or, looking back, I can say that now).
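
John's blood-typing classification, minus the recording side of the application, reduces to a compatibility table; the sketch below uses the standard ABO/Rh transfusion rules, though in modern Python rather than the assembler we actually wrote:

    # Donor type -> recipient types that can safely receive it (standard ABO/Rh rules).
    COMPATIBLE = {
        "O-":  ["O-", "O+", "A-", "A+", "B-", "B+", "AB-", "AB+"],  # universal donor
        "O+":  ["O+", "A+", "B+", "AB+"],
        "A-":  ["A-", "A+", "AB-", "AB+"],
        "A+":  ["A+", "AB+"],
        "B-":  ["B-", "B+", "AB-", "AB+"],
        "B+":  ["B+", "AB+"],
        "AB-": ["AB-", "AB+"],
        "AB+": ["AB+"],                                             # universal recipient
    }

    def can_donate(donor, recipient):
        return recipient in COMPATIBLE[donor]

    assert can_donate("O-", "AB+")     # universal donor reaches everyone
    assert not can_donate("A+", "O+")  # type O recipients take only type O blood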

Political boundaries in only two hundred lines of code

By the end of my high school senior year, I had learned to revise programming code much better than I could revise a term paper. Inefficiencies jumped out at me as I reviewed a printout of my code. I was well behind on my senior-year science project (I had won an award in the state science fair with my junior-year contribution). The only way to finish something respectable while dealing with senioritis was to make it a self-assigned, interesting programming assignment. Did I tell you I loved maps? Crazy about maps. Something about my mind's eye being able to experience a place (fictitiously, of course) from a map of it made maps the source of much pleasure (even more so now that travel is part of the experience). I decided to take the map of Connecticut and break city boundaries into line graphics that could be drawn on the Apple IIe (this is not a story of code reuse, if it seems we're headed that way). That looked really nice.

In a questionable move, I decided to write an algorithm that would set political boundaries based on map characteristics (and a table of town populations). I didn't know about the famous map coloring problem (in fact, I didn't define anything as problems - just opportunities :). I did no research into potential political allocation algorithms. I just banged out an elegant algorithm (using my newfound iteration skills) that was at the core of a map coloring program that showed newly developed political boundaries. I changed the number of electoral seats in the House of Representatives from four to five to six to (well, all the way up to fourteen) and let the algorithm determine which towns would be assigned to which districts. The maps were gorgeous. I tested the algorithm on the counties of New York state. They looked good too. I made a cool poster explaining my approach and displaying the maps. At the Connecticut State Science Fair, I won an advanced scientific calculator from a woman who had lost her husband and said he would have liked my exhibit. I was pleased with myself. Then, two of my so-called buddies pointed out to me how useful my algorithm would be for a bomb-dropping planning session (it was true that my algorithm also found the geographic center of each district's population so that a capitol could be built there). It bugged me for weeks that those boys were of almost pure German heritage and too enamored of the wars their ancestors had supposedly lost. The surprise from that project was the fact that so much could be done in so few lines of code. The lesson was that you might never anticipate the evil your invention could ultimately be used for.
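
The core of the algorithm is simple enough to sketch from memory; the version below shows the two pieces that mattered, a population-balanced assignment of towns and a population-weighted capitol site, with invented town data and no claim to match the original two hundred lines (notably, it ignores the contiguity a real district needs):

    # Hypothetical town data: (name, population, x, y) in map coordinates.
    towns = [("Avon", 14000, 2.0, 5.0), ("Bristol", 60000, 1.5, 4.0),
             ("Canton", 9000, 2.5, 5.5), ("Derby", 12000, 1.0, 1.0),
             ("Essex", 6500, 6.0, 1.5), ("Groton", 40000, 7.0, 1.0)]

    def assign_districts(towns, n_districts):
        """Greedy sketch: seed districts with the largest towns, then hand
        each remaining town to the least-populated district so far."""
        ordered = sorted(towns, key=lambda t: -t[1])
        districts = [[t] for t in ordered[:n_districts]]
        for town in ordered[n_districts:]:
            min(districts, key=lambda d: sum(t[1] for t in d)).append(town)
        return districts

    def capitol_site(district):
        """Population-weighted centroid: the spot where a capitol could be built."""
        pop = sum(t[1] for t in district)
        return (sum(t[1] * t[2] for t in district) / pop,
                sum(t[1] * t[3] for t in district) / pop)

    for i, d in enumerate(assign_districts(towns, 3)):
        print(i, [t[0] for t in d], capitol_site(d))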

Verifying loss tables for an international property and casualty insurer

I wish I could pretend to have participated in some really eye-opening programming assignments during my undergraduate experience. There were none. I had chosen to be a business major. Most assignments had to do with crunching numbers in formulas in a spreadsheet, balancing columns of numbers, playing out a case study in corporate America, or learning about the psychology of human beings and how they motivate each other or make decisions. There were no late-night geek sessions or special keys to hidden caches of next-generation computing machines. Basically, there is no Bill Gates story here. But, I became a well-rounded human being. I learned about how the real world works. Yes, of course, it was a depressing time, but it was a backdrop against which the rest of my life could be judged (and thus, wildly exciting and productive). And, I did fun things with real people in the physical world. We should never underestimate the need for that in the human psyche, eh? Business assignments were straightforward. There were no stories of going to the computer lab on Friday night to do an hour's worth of coding and waking up Sunday afternoon having missed the first game, in a deep sweat over not being able to find a coding mistake that was having mischievous consequences. The question of how to get back to the world of computers became a ten-year story (you get the gist of it from these projects).

Anyway, I was out in that so-called real world, auditing the actions of so-called real people for the good of a society that could be harmed by their misuse of fiduciary power. I was writing many formulae within Lotus 1-2-3 spreadsheets on very heavy portable computers. And, I was making a name for myself in the insurance industry. I had a knack for actuarial tables. I could almost say I liked them (well, I did like them more than most auditing tasks). Then, I learned something that many people probably don't know. Auditors are not even allowed to write programs to verify people's analyses. We have to use algorithms that have been tied into programs that have been frozen - meaning that they aren't alterable. Every time I used a program to verify something in the financial statements, I had to check the size of the program in bits, the date of the program to the second, and the name of the program to the letter. And, since commercial insurance companies make so much of their profits based on state-approved actuarial loss tables, the program that calculated loss table ratios of experience versus charged premiums had to run on a special machine - one that could not be tampered with. There was no external drive. It was loaded with the program in Cleveland and sent to Boston for use. It was a Radio Shack Tandy machine that looked as much like a radio as a computer. Just one more box for me to lug around to the client (making those defiant days when you just have to take public transportation to be socially conscious a real challenge). With all that, I got reprimanded once for leaving the machine out of my sight in the home office (not the client's, mind you) while I caught a quick lunch. Right, as if I was about to bring my radio computer to Sbarro. I was being brainwashed to hate computers. Perhaps I shouldn't have mentioned my interest in converting my career to become a process consultant so I could participate in programming solutions to societal issues. I hadn't considered the conspiracy angle before this exact moment (hence the benefit of writing one's story). Anyway, the surprise of this project was how warily the computers we relied on as allies had to be treated. The lesson was the respect I gained for potentially tampered code (it could get a multi-millionaire to invest in a stock she otherwise would not). OK. Fast forward to more optimistic times...
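
Restated as code, the audit discipline looked something like the sketch below; the manifest values and file name are illustrative, and the content hash at the end is a modern addition rather than anything we had then:

    import hashlib
    import os
    import time

    # The approved record for one frozen verification program (invented values).
    APPROVED = {"name": "losstab.exe", "size_bytes": 48412,
                "modified": "1989-03-02 14:07:31"}

    def program_is_untampered(path):
        """Match name to the letter, size exactly, and date to the second."""
        stat = os.stat(path)
        modified = time.strftime("%Y-%m-%d %H:%M:%S", time.localtime(stat.st_mtime))
        return (os.path.basename(path) == APPROVED["name"]
                and stat.st_size == APPROVED["size_bytes"]
                and modified == APPROVED["modified"])

    def content_fingerprint(path):
        """A modern auditor would also pin the content itself with a hash."""
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()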

Airport runway control at a fictitious airport

After a series of programs that seemed no more advanced than what I had tinkered with in high school (but which taught me the Pascal programming language), I had finally found a programming assignment that smelled like challenging new skills development. I was taking that most dangerous of dangerous classes, Data Structures - the course where hardcore computer scientists learn about the metrics of their craft (speed, size, and the trade-offs therein). That course is just too addictive while being strict in its vocabulary and approach. All too many computer science students come out of that course inside a newfound, lead-walled thinking box. That affectation was a big deal until recently, when container objects started being thrown around as part of computing frameworks (and no longer need to be the all-consuming focus of data-structure-loving individuals). I was one of the addicts. Coding queues, stacks, sorts, and linked lists was just too euphoric for my own good. Connecting them to cute little real-time visualizations made of x's, o's, and line segments made it more fun yet.

Luckily, some clown at Cornell had unleashed a wicked computer virus (the first successful worm) onto the national academic network, and I was forced to deal with a whole helluva lot outside of the airport simulation task at hand. Like a wax-on, wax-off exercise of patience to teach me the insides of a Poisson distribution, I struggled against La Bomba - a cute little Macintosh graphic that filled the screen with black, mostly circular bombs grinningly announcing, 'your core memory has been corrupted and the machine has frozen solid as a result'. The virus didn't just infect core memory. It infected the floppy disks we saved our programs on. Any one of us on any of the two hundred Macintosh boxes could read off a diskette and drop the bomb on all other 199 users in a matter of minutes. And, should we get the whole campus allotment clean, some externally networked box could bring it back to us. How utterly frustrating for those of us who had no idea what was going on (those of us who actually worked alone on our programming assignments because we worked full-time during the day). I kept imagining what would happen if real air traffic software ran in a similarly networked manner, across the world. The project took three times as long to finish as it should have (and ten times as many floppy disks). For me, data structures was not the course experience other shiny new converts of later years had (but, don't get me wrong, lectures were pure pleasure, the book was as useful as the Meaning of Life, and the tests were as fun to take as a jello swimming event). I had battled an unknown force which made precise, mathematically sound code of little value. My surprise was how ruthless a computer virus could be to a highly regarded computer lab. My lesson was to be on the lookout for project issues that might not seem too important but could become a number-one critical-path legend.
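
Underneath all that chaos, the assignment itself, planes arriving at random and queuing for a runway, fits in a few lines; here is a toy reconstruction in Python with invented parameters, sampling the Poisson arrivals the course was drilling into us via Knuth's classic method:

    import math
    import random
    from collections import deque

    def poisson(lam):
        """Knuth's method: multiply uniforms until the product drops below e^-lam."""
        threshold, k, p = math.exp(-lam), 0, 1.0
        while p > threshold:
            k += 1
            p *= random.random()
        return k - 1

    queue = deque()                        # planes waiting for the single runway
    for minute in range(480):              # one eight-hour shift, minute by minute
        for _ in range(poisson(0.3)):      # about 0.3 arrivals per minute on average
            queue.append(minute)           # remember when each plane showed up
        if queue:                          # at most one landing per minute
            waited = minute - queue.popleft()
            print(f"minute {minute}: plane landed after waiting {waited} min")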

A first database to manage credit card billing allocations

So, I couldn't get back to computer programming projects through the firm. I chose to go back to business school and take an advanced degree that got me programming intensively. Information science can be about the worth of information, the structuring of information to create knowledge, or the metadata description of information. Or, information science can be about discovering, processing, and disseminating information without necessarily knowing the meaning of what's being passed around. To get back on a computer science track, I chose a program with a strong focus on the latter (while learning all I could about the rest of it). Most of the programming tasks in business graduate school are nicely academic, especially when you are working nearly full-time and can't dally with programming teams all day long. Since a project isn't quite the learning experience without other people to change your thinking, I chose a representative project from my work life at the time new ideas from grad school were filling my noggin.

Working for a credit card processing department within the second-largest bank in a midwest state, I saw a lot of data as I contemplated information science. Monthly, I received a tape of transactions for every merchant or cardholder of the eighty-four banks that authorized each purchase or sale involved in a Visa- or Mastercard-processed event. There are many possible problems that can come up with credit card transactions. You probably guessed that fraud was the biggest issue. Yes, in fact, Visa and Mastercard together wrote off $4 billion worth of fraud in 1988. That amount gets allocated into the rate they charge for each transaction (something like .000000767 cents per transaction, if memory serves). Each of those eighty-four banks could choose to protect themselves against fraud via various services (a sophisticated market, for sure). The reporting organization that worked with Visa and Mastercard to record all those transactions was located in Columbus, GA (a nice place to go in January for a yearly meeting, I can vouch for that). They did not want to figure out whom to charge various expenses to. They just lumped those charges together, and I got to decide which bank should receive which allocation of these miscellaneous monthly charges. It was like no auditing task I was ever assigned. I got to be creative and yet try to be fair based on my sense of ethics (which had been challenged in a philosophy course more thoroughly than most undergraduates experience). I decided to automate this allocation process. First, I learned how to convert the reporting tape into a stream of text and floating point values (millions of them). Then, I learned how to write simple aggregation routines to get buckets of subtotals. Then, I wrote an allocation algorithm that used the subtotals to dictate the percent of charge allocated. Then, being a good auditor, I created a second method for cross-checking my allocated totals. Lastly, the step I considered a growth step worthy of my business school prerequisites: I called the eighty-four banks and spoke with the person who had power over fraud protection decisions. I introduced myself and sent them my algorithm in a written format (via interbank mail). They then called me to challenge my decisions (which did change over time as I learned more about bank behavior - the lesson is to charge those who can change bad behavior the most, not those who can't really change their behavior based on mitigating circumstances). The surprise the project offered was the thrill of automating a tedious manual task. The lessons were many about human behavior and the ability of a computer program to outperform a human being in allocating a large and tedious series of numbers.
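
The middle steps of that pipeline reduce to a few lines; here is a minimal sketch with invented figures (the real tape held millions of values, and the final step, phoning eighty-four bankers, does not reduce to code):

    # Invented sample data: (bank, transaction amount) pairs off the reporting tape.
    transactions = [("bank_07", 120.00), ("bank_07", 55.10),
                    ("bank_23", 300.25), ("bank_61", 80.00)]
    lumped_monthly_charge = 1000.00

    # Step two: aggregation routines producing buckets of subtotals.
    subtotals = {}
    for bank, amount in transactions:
        subtotals[bank] = subtotals.get(bank, 0.0) + amount

    # Step three: allocate the lumped charge in proportion to each subtotal.
    total = sum(subtotals.values())
    allocation = {bank: lumped_monthly_charge * sub / total
                  for bank, sub in subtotals.items()}

    # Step four: the auditor's cross-check that allocations sum back to the charge.
    assert abs(sum(allocation.values()) - lumped_monthly_charge) < 0.01
    print(allocation)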

Job control and COBOL programs that pay field agents due commission

Moving on to another organization, I chose a program where you work five information science projects in five years and then decide whether you want to become a director and dive deeper into a specific business line. Perhaps you've met a few recent computer science graduates from prestigious programs in your lifetime. They tend to think they can do anything with a computer and have a hard time relating to people who don't feel the same way. So, this organization wisely put new recruits to work on very small teams that were dealing with very tedious and cumbersome problems. Although a step up from punch cards and machine assembly language, writing job control instructions to an enormous mainframe certainly felt just as archaic compared to today's computing environments. I did battle in the trenches with six other programmers who basically duct-taped pages and pages (screens and screens) of spaghetti code together to get field agents and remote office personnel their monthly commission checks, accurately and on time. The system had been written twenty years earlier and tracked over four thousand individuals for sales volumes, profitability of policies, and policy cancellations. The more you sold, the more commission you made. The more profitable the sales, the higher the commission percentage. Yes, these software programs were from the days of massive COBOL language programs written when GO TO statements were still acceptable.

My favorite memory of that tour of duty was the fact that each program had a name, such as the maintainer, the accumulator, the balancer, and the reporter. They ran in sequence overnight during a six-hour period. We would run four practice runs to make sure our monthly programming changes worked before the official run would go and actually cut checks, print letters, address envelopes, and send the pieces down the line to an in-house post office. The software had many checks and balances in it. If any assertion was violated, the whole run would back out and return to the start. If that checkpoint was near the end of the six-hour run, the backout might add another two hours to the night. How bizarre that a series of computer programs could create such a family atmosphere of support and communication. One error would put us all back to square one. One year-end run seemed to last twenty hours (each of us had to do an overnight, actually watching the run interactively through screen print-outs that reported the status of tens of milestones within a run). The lesson I learned was how mission-critical software required a more conservative approach (agents who got overpaid weren't very likely to return the overpayment). The surprise was how personal and quirky the humans were when dealing with such a boring, calculation-intensive software environment. Some of my peers made fools of themselves with their high-and-mighty attitudes. Others surprised me with their passion for work and consideration of every person throughout the organization.
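
The shape of those nights translates naturally into code; this sketch invents trivial stage bodies but keeps the structure that mattered: a fixed overnight sequence, assertions as the checks and balances, and a full backout to the start on any violation:

    # The four named programs, reduced to stubs that pass a state dict along.
    def maintainer(state):
        state["records"] = 4000
        return state

    def accumulator(state):
        state["volume"] = 1_000_000.00
        return state

    def balancer(state):
        assert abs(state["volume"] - 1_000_000.00) < 0.01, "out of balance"
        return state

    def reporter(state):
        state["checks_cut"] = state["records"]
        return state

    PIPELINE = [maintainer, accumulator, balancer, reporter]

    def nightly_run(max_attempts=4):
        for attempt in range(1, max_attempts + 1):
            state = {}
            try:
                for stage in PIPELINE:   # run in fixed sequence
                    state = stage(state)
                return state             # the official run can cut checks
            except AssertionError as err:
                print(f"attempt {attempt}: backing out to the start ({err})")
        raise RuntimeError("wake everyone up: the year-end run is stuck")

    print(nightly_run())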

Cash value illustrations in a C sweatshop

Moving on to year two, I found myself parked on an all-male development staff who were forging ahead with the C language and making colorful business charts for customers to consider when looking at various life insurance policy options. An insurance company can suggest that you let them manage all your free cash. In that case, they bundle investment with life insurance such that you are covered for risk of death while accumulating a cash balance on your policy. We wrote the calculation engines that crunched the future value of premium accumulations and subtracted out monthly, quarterly, semi-annual, or yearly premium payments (your choice). Since every dot printed on a piece of paper was within our control (a liberating experience after character-based systems), we challenged ourselves to print the best-looking business charts (bar charts, line charts, pie charts) that would graphically depict the yearly numbers printed in multi-page tables. The guys were close-knit and, well, guys. My mentor actually held a swearing session with me to make sure I could fit in with the team culture.
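
A toy version of such a calculation engine might look like the sketch below; the credited rate and the flat cost of insurance are invented, where a real policy prices the insurance from mortality tables:

    def cash_value_by_year(annual_premium, years, credited_rate,
                           payments_per_year=12, monthly_insurance_cost=20.0):
        """Accumulate premium payments at a credited rate while deducting
        a cost of insurance; returns one cash value per policy year."""
        balance, history = 0.0, []
        period_rate = credited_rate / payments_per_year
        for _ in range(years):
            for _ in range(payments_per_year):
                balance += annual_premium / payments_per_year                # premium in
                balance -= 12 * monthly_insurance_cost / payments_per_year  # cost out
                balance *= 1 + period_rate                                  # interest credited
            history.append(round(balance, 2))
        return history  # the yearly numbers the charts would depict

    print(cash_value_by_year(2400.0, 5, 0.05))  # monthly payments, 5% credited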

I had dabbled in C previously and had come to the conclusion that there was quite a bit of flexibility in C technique and formatting. And so, I decided to take up conversations with other C coders throughout the organization who had a reputation for being very knowledgeable about computer science in general. I came back from my first consultation to find everyone else working heads down and my supervisor motioning me into his office. That's when I learned, through a red-faced tirade, that I had committed an act of treason (labeled insubordination, but obviously more serious than that) by speaking with the enemy. My supervisor had managed a Montana silo for six years prior to choosing a more commercial occupation. He had one of two keys that could have launched a missile aimed at the USSR. He did not take fraternizing lightly. I spoke calmly and clearly to let him know I was only trying to help by scouting intelligence that could make our work more productive (and my participation more knowledgeable). How was I to know that our team had been hand-picked by him to be fully self-reliant and capable of researching all needed data via internal methods (he had a nice budget for programming books)? I felt like I had all my stripes ripped from my sleeve and was sent on latrine duty. Only after eight months of working quietly and diligently (and participating in sixty-hour work weeks without comment) did I gain his respect back. He was able to give me a satisfactory job review and ship me onwards with a rating of C+ (before the days of C++ being popular). A C for a C rotation. My lesson was about learning to gauge group dynamics before acting. My surprise is how, still to this day, I imagine what it would be like to be trained as a military specialist. The thought of it is quite terrifying, and yet those with personalities similar to mine of years past were sent to battle enemies they knew very little about.

Evangelizing groupware in a company of 35,000

Just as I had resigned myself to considering the computer's sole real value to be as a computation machine, I was given the opportunity to drive improved computer-mediated communications across time and place for business groups attempting to improve their workflows and communication quality. Things got wonderfully messy as computational functions spread across minds and machines. I spent a year participating in insightful case studies of top-down versus group-driven decision-making. Visiting nurses who knew their clients well made huge jumps in life quality and efficiency by joining the electronic communications realm, downloading patient-support information from anywhere at any time, and uploading their mandatory reports from the luxury of their homes. Telecommunications engineers thrived in a group process that let them negotiate their work without the top-down favoritism that was the rule of the day when 250 engineers could not be managed any other way - only those who had enjoyed unfair benefits in the past fought the system. And yet, other groups so used to top-down communication patterns could only use their new tools for gossip and chatter unbecoming of the investment made in technology by their employer. The key, I learned, to thriving in computer-mediated communication processes was having a meaningful role, valuing connections with human beings versus machines, and taking pride in the quality of your contribution to shared knowledge and improved group process. I could not judge others for their lack of these qualities, but I found they came easily to me, and so I became a little manic in contemplating the potential.

Shared underwriting infrastructure for regional commercial insurers

The rubber hit the road (literally, in the amount of travel to establish trust with teammates) as I was given a year to build computer-mediated communications infrastructure for fifteen regional lead underwriters of sophisticated mid-size commercial risk insurance policies. These people are incredible communicators who spend their days madly coordinating activities among risk-assessment specialists, financiers, claims specialists, field agents, pricing and policy-production specialists, and, of course, existing and potential customers. They are used to connecting one-on-one with their peers in other regions by telephone, and meeting face-to-face a few times a year to discuss strategy and financial results. How could I convince them to become more transparent and documenting for the good of the order? The year was a fascinating journey to the four corners of the U.S., but how would a trained evangelist make inroads with such a savvy group of communication veterans who enjoyed the responsibility that came with their hard-earned authority? I learned the most important lessons in the attempt. These things take time, but are inevitable as changes in communication patterns among the young move forward in time through their life stories. I learned to admire the current state of communication evolution and respect it for what it is while still seeing the better future that is to come. And, I learned some manners and business protocol in doing so. Thanks to all the regional underwriters for opening your world to me!

Task management for paper mid-size commercial policy producers

I guess I did alright. I was passed from one senior director to another. The next task: help organize six policy-production centers into a single group of responsible, hard-working communicators. Two hundred fifty people deep in the trenches of providing the paper support required by law and common-sense good customer relations. What an accumulated case history of legal requirements that one is! Anyway, there are good days and bad days in each office. Some days Richmond is sailing smoothly and Naperville is getting crushed. Others, Denver is living the good life, and Richmond is in crisis-management mode. Why not share the burden and reward one office for bailing another out when the opportunity arises - returning the favor when the favor avails itself? My job: provide the shared communications infrastructure so unbiased mediators in Hartford, CT could pass the work around between offices equitably. I built a task-management process to help each worker manage their workload while at the same time transparently sharing their current state behind the scenes with a benevolent management team looking out for them and the customer. Here comes Big Brother, right? Nope... they loved it and I loved them. A year-long lovefest and anthropological study of management theories. Richmond staffed by college graduates just out of school. Denver staffed by secondary-income-earning spouses. Naperville, Rochester, Buffalo, and Dallas were interesting mixes along that continuum of homemaker to career-maker. That one was the best win-win-win-win job of my career to date. And, boy, did we have fun!

Web site for an early adopter technical recruiter

Life is a fascinating journey if you learn to listen, contemplate, and meditate closely. The Web phenomenon took off in 1994 for me. There was more to life than connecting people to build a better insurance industry. Where else might we be able to drive a superior computer-mediated communications process to improve quality of life? Nothing had had a more substantial impact on my quality of life than the opportunity to participate in work projects that introduced me to new people, new ideas, and new societal roles. So, why not help facilitate the process that connects a motivated worker with a need-identified employer? Basically, I built a humble Web site that demonstrated the underpinnings of what Monster.com has become so impressively twelve years later. It was so easy to do! The internet service providers of that era were trailblazers with bubbling passion for the changes possible in society through their efforts. An incredibly heady time for meeting new people in discussions where ideas were emerging and being reinforced through impassioned debate.

3-D virtual environments for the Web

Authoring and editing text-based Web pages got me this far, but the thought of a shared virtual reality burst out of flatland. Just my luck, a spot opened up on the Virtual Playground project with a very communicative and organized project director. He and I signed up to help our Taiwanese client demonstrate the value of an affordable, on-board graphics co-processor through shared virtual worlds accessible via the Web. Paul introduced me to the fascinating potential of multicast communications between large research laboratories, and I researched the more prevalent client-server protocols to reach out to small, home users. Our client gave us so much freedom to design as we wished that I lost my program manager after six months. He seemed so concerned and uncomfortable that our client could not agree on a specification - a by-product of his formal computer science education, no doubt. I took over as product manager just as Paul had put together a really great system architecture based on Java and Java 3D technologies. I worked with stylish traditional building architects and artists and prototyped all kinds of cool features within our Netgate Shopping Mall and Science City prototypes. I delivered a keynote address in Taipei along with a demonstration of how quickly Web networking protocols could be hooked into navigable 3-D cyberspace. We built a process by which anyone could load any VRML model from anywhere on the Web into Science City. I investigated models from all over the world. I worked with a Chinese artist from Kaohsiung in southern Taiwan, and we built a fun little scavenger hunt for 8-to-10-year-olds to play in while racing to fill their team home with a shopping list of items. An earthquake abruptly ended our twenty-two-day experiment, attended by kids all over Taiwan, fifteen at a time. We had very few technical difficulties but did drop a lot of multicast packets across the Pacific. The client-server model ruled the day, but multicast seemed so much better.

I don't expect to ever feel better on a day-to-day basis than in the two years the Virtual Playground project provided me with such a dreamscape skunkworks. I learned more about people, life, and technology in those two years than any academic experience could possibly have provided. I saw a possible future we are likely to all gravitate towards five to ten generations from now. It is a grand vision, but quite scary to people like me born and raised in the late 20th century. We aren't wired to thrive in virtual cyberspace yet, and so we don't know where we are heading. The only thing I feel sure of is that we won't really recognize leading-edge technology people two hundred years hence - not even sure we'll recognize the average Joe on the street. So many questions about utopia and idealism, pain and suffering, morality and mortality brought about by playing in a virtual world one designs from a platform one designs. Even more questions as people inhabit artificial avatars and you don't know whether the nodding skateboarding kid across from you is driving his experience from Malaysia. No wonder people get lost in these jobs - a very happy and engaged lost. I don't purport to even begin to suggest what it all means for us. I just can't imagine turning back. It's a waiting game, for sure.

Procedural worlds for learning to navigate in 3-D cyberspace

Handcrafting 3-D virtual worlds is lots of fun and uber-empowering. A little imagination and you are playing god to a timeless universe of your own making. A lot of imagination and you can't sleep at night. Then again, to do the kind of quality work worthy of getting paid, I leave that domain to the capable hands of the cadre of competent artists I have been meeting in my life. My strengths are more algorithmic, and so I joined up with the DigitalSpace guys to consider how we could proceduralize virtual world development. The result of our shared vision was a Java-enabled browser page from within which groups of people could choose artwork and configure a virtual world using a series of HTML forms and JavaScripted behaviors. We sold it to an insurance company that manages insurance policies for 2.5 million public school teachers. The hope was that teachers from around the world would meet in stylish virtual classrooms and discuss violence and bullying and the escalating issues that came to light as a result of the Columbine tragedy.

It was great to merge my interest in futuristic virtual reality solutions with my belief in transitional paths. We demonstrated how shared 3-D virtual worlds of use could be evolved from the current Web. We made it usable by public school teachers who needed the opportunity to get a sense of the video game worlds their students were inhabiting and thriving in after hours (or during hours, given a truancy tendency brought on by old-school, sage-on-the-stage drivel). The teachers were intimidated and uncomfortable. It is good to know our public teachers are so gregarious and friendly in their preferred face-to-face communication styles. Video-gamer teachers are hopefully just a few generations away, and then the idea will click. Always be thankful for the optimistic dreamer in corporate America who risks their career to do something out of conscience rather than for payback. Not that I invest much in the stock market, mind you. I might have a different opinion in that case. Anyway, it's not appropriate to out the company who provided us such a great ride (telecommuting friends between Australia, Amsterdam, Italy, Seattle, and the San Jose hills). More dreaming and implementation, but closer to mainstream society. Just not close enough yet.

A first book project on a 3-D graphics file format for the Web

I finally got it: the best learning experiences don't come from sitting idle in a classroom absorbing more content than your peers and regurgitating it on an exam. Sure, that exercises the brain cells a bit for short-term capacity, but it does little for long-term retention. I would have to say the best learning experience comes from being on the hook to deliver a book to a demanding and eager audience on a topic you barely understand in detail but are motivated by in concept. Out of the blue, a book publisher contacted me via e-mail to ask if I would be willing to co-author a book on an emerging graphical file format - the one I'd been using on the last two projects through an iterative hack-and-see process. I jumped at the chance and found myself working fourteen-hour days that felt like three-hour days. Four editors, a co-author, and a Web filled with fodder to consider for the book.

We sold 8,000 books within the U.S., had the book translated into a few other languages, sold the rights to the text to various software houses, and used the book to teach evening-program courses at the community college level. My interaction with people all over the world in public visual 3-D chat spaces provided context as can-you-help-me e-mail messages began to come in from every continent at every hour of the day. As my co-author had no time for being helpful on a one-on-one basis (he had earned his wings doing some very difficult technological implementations), I took on the task with gusto. Amazing how much goodwill a person can build up in a short time just by knowing a body of knowledge cold and enjoying the challenge of an is-it-possible-to-do-this threaded discussion. Seeing your name on the cover of a book in the Oxford University Bookstore front window is nothing compared to seeing the world map of everyone you have had a useful discussion with one-on-one. I am still scratching my head as to how all that came about. As they say, 'it could happen to you'.

A second book project on making HTML more dynamic

My publisher was very thankful that I was able to join a project so late in its schedule and deliver a quality product. Had I ever done that before? Hmmmmm. Don't think so. Anyway, perhaps goodwill always comes back to you if you stick with it. I was offered the opportunity to write a book on the emergent HTML 4.0 standard and was promised a seasoned co-author (who had just moved to a small town in South Dakota and had lots of time to write) in order to be first to market with a book on the empowering dynamic scripting capability called Dynamic HTML. Of course, Microsoft and Netscape were warring, and the term Dynamic HTML had little clout as a trusted consensus specification. I was on Netscape's side in terms of hopes for creativity and yet on Microsoft's side for getting better user-feedback data incorporated into an HTML specification. I spent a very engaging day at Microsoft's author-preparation day for HTML 4 and met my competition in the race to get first to market. Of course, our book would include fair mention of Netscape's implemented dynamic scripting features without the HTML 4 blessing afforded Microsoft.

Cutting to the chase, I found out what first to market means. You sell a helluva lot of product, but you discover lots of mistakes soon after it hits the streets. Our mistakes were not typographical or programming mistakes, as the four members of the editing team were super, but of the fluid-specification type - our examples, one by one, no longer worked in the browsers we were pitching as both Netscape and Microsoft changed their approach to Dynamic HTML. The book sold more than twice as many copies as the first (I had learned to take the variable contract with royalties this time around), was translated into fourteen languages, was sold to six software houses I know of, and had a retail tail of five years before it was sold for an overvalued twenty cents at tag sales. Oh, but I was so sure we'd get the gig for the second edition. I wanted to do it out of pride and a devotion to my 25,000 readers out there, but not for the struggle of having to review something I had already gotten everything I wanted out of. The re-write never happened. The market for Dynamic HTML turned out to be oversaturated. Our book probably did the best of all of them financially. There is a lesson about Microsoft's traditional approach in general in there somewhere. And, yes, if you are envisioning the e-mail traffic on that one, you likely got it right - ouch for any author who doesn't enjoy reaching out to a planet's worth of tinkerers and wannabe developers (basically, people like myself in this case).

A platform for shared 3-D virtual environments

As the landscape for shared 3-D virtual environments exploded, many of us wondered why we had to maintain so many different avatars and learn so many different graphical interfaces and keystrokes in order to participate fully in the best online adventures. Developers with different experiences came together to create an OWorld community in order to define a core specification for sharing 3-D virtual worlds and to create translation services for connecting existing platforms into a shared OWorld experience. I attended the OWorld conference at San Jose State University with all expenses paid, through volunteering my time as the webmaster for the event. As webmaster, I interacted with each key participant one-on-one before the event and got access to all their research publications after the event. I probably enjoyed the interaction as webmaster as much as the project work itself.

The OWorld conference became a cross-over event, building upon the first and second Digital Biota conferences to discuss how best to share digital life ideas and interpersonal evolution in a 3-D cyberspace based on interesting themes in biological, chemical, and physical research in the real world. In that regard, the 'O' in OWorld could be thought of as 'Organic'. Since much of the holdup in making advances in digital evolution was the fact that each digital life researcher was using a different platform in which to evolve computer algorithms, we discussed the core features of each existing research platform while continuing discussions on why we should pursue digital evolution and how it could shed light on many computational problems relevant to the sciences. The specific discussion on evolving digital life led naturally to a discussion of shared 3-D virtual platforms in general, and the 'O' in OWorld became more about 'Open'. We got a great project boost when a critical mass of platform developers attended our OWorld Technical Summit in Santa Cruz. By then, the Anarchy Online architects were looking for a new funder, DigitalSpace was wooing clients on providing low-bandwidth solutions in the Web browser, and the Virtual Playground code I had been developing no longer required a license from the University of Washington in order to be shared with others. Operating system developers joined us to discuss how operating system standards had evolved towards a consensus of services. We created a reasonable shared 3-D environment architecture and spent time implementing it in a subset of summit attendee platforms to see how difficult it was to adapt existing platforms for shared experiences. Ideally, I would enter a Virtual Playground environment, Michael would enter an Anarchy Online environment, and others would enter our emergent in-browser platform, and we would all see each other and interact as if we were all using the same platform.
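
One way to picture the architecture we converged on: each platform keeps its own engine but implements a small shared interface, and avatars cross worlds through translation. The class and field names below are illustrative, not the actual OWorld API:

    class PlatformAdapter:
        """The translation service each platform would implement."""
        def export_avatar(self, native):   # native avatar -> shared form
            raise NotImplementedError
        def import_avatar(self, shared):   # shared form -> native avatar
            raise NotImplementedError

    class VirtualPlaygroundAdapter(PlatformAdapter):
        def export_avatar(self, native):
            return {"id": native["name"], "pos": native["xyz"], "mesh": native["model"]}
        def import_avatar(self, shared):
            return {"name": shared["id"], "xyz": shared["pos"], "model": shared["mesh"]}

    # Two users on different platforms meet by round-tripping the shared form.
    vp = VirtualPlaygroundAdapter()
    shared = vp.export_avatar({"name": "len", "xyz": (0, 0, 0), "model": "eagle"})
    print(vp.import_avatar(shared))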

The project provided me with some of the most thoughtful time in my life for considering themes emerging in Open Source platforms. By the time I had exhausted all the relevant thoughts I needed to grasp the API I helped design, I really felt the OWorld approach could be highly successful if we only had a benevolent dictator who was willing to devote their life to making it a reality. Sadly, I had no such motivation for it, though it could really have been an amazing ride. I just knew myself too well - there were too many other interesting ideas out there I wanted to dabble in.

An attempt to share dynamic 3-D environments on the Web

At some point while the OWorld API was solidifying, the fallout from various high school violent events like the infamous Columbine High School massacre reached a pinnacle in public discussion. Parents of students were uncomfortable, teachers were uncomfortable, and students began changing in ways that did not seem so healthy for their optimal learning experience. DigitalSpace fielded an inquiry as to how we might build an online environment for discussing school-based violence so that any student, teacher, or parent could get involved from anywhere in North America. Our solution was an in-browser application we called Meet3D. By adapting an existing Java-based shared 3-D environment called Shout3D, and using that experience to repeat the process with other Java-based environments such as Anfy3D, we built a low-bandwidth, low-footprint shared virtual experience for a client who provided insurance and retirement saving services to 2.3 million public school teachers in North America. A VIP at the funding organization actually knew children who attended Columbine High and was able to influence a funding group to allow us to develop a prototype that could demonstrate how best to provide a safe and secure discussion place online.

We saw great value in providing a visual three-dimensional schoolhouse with classrooms in which teachers could discuss violence prevention and injury mitigation and even model situations where potential crises might break out. We built a multi-building campus with different schoolhouses that could be visited to discuss a variety of core topics. One of the buildings provided a place for teachers to discuss their retirement savings strategies and invite a company specialist to attend their discussions when a topic became hot and heavy.

The solution was actually quite easy to implement technically. Java had evolved far enough to allow us to create the interactive communication features reliably, and Shout3D was wonderfully capable at providing a very low-footprint experience (we packed the virtual world and the whole 3-D experience engine into 144 kilobytes). We tested our core widgets with the Stanford Business School, which used them to let alumni share online lectures as part of an alumni benefit. We tested Shout3D heavily but knew we didn't have to, as we knew the development team well. Most of our time was spent debugging JavaScript across the popular browsers of the day, and changing the keystrokes and interactive mouse events to be compatible with our OWorld vision as it continued to evolve. We delivered our solution to the client very happy with ourselves and with having the experience to prove that the DigitalSpace vision of a creative commons community around the world was a brilliant idea and an enjoyable way to work. We worked around the clock, with our Italian team, Australian team, and West Coast USA team each able to work eight-hour shifts without progress ever having to stop - the now-obvious benefit of making use of time zones within a project team. I enjoyed answering Italian questions in the morning, working on code during the day, and asking questions of the Australians before retiring in the evening. The answers were always there in the morning when I got started again, and we all marveled at how we felt like we had worked while we slept!

Procedural plants in augmented reality

While attending SIGGRAPH 2000 in Los Angeles, I met George Fifield, who organizes and oversees the Boston Cyberarts Festival every two years. He came to our ARToolKit emerging technologies demonstration and immediately started spewing out ideas on how fantastically our application could win over Boston at the Cyberarts Festival. I guess I was the only one who stayed the course in returning volley on all his questions and ideas over the next six months. I was convinced I could build something amazing and be the one to experience this great interactive demonstration experience in Boston for fourteen days. I saw it as an ideal opportunity to get back east for the cherry blossom season and a chance to spend time with Dad in a capacity where we would actually work together.

I wanted to tie together three ideas I had been studying independently. I had become almost addicted to playing with L-systems, the recursive string replacement algorithms that simulate plant growth. I was in that time of life I suspect everyone has, when the full impact of genetics, development, and evolution hit me, so I saw the theme in almost every living system I investigated (from bacteria to the cosmology of galaxies) and literally felt high every day for a year just reading and thinking about the computational issues associated with life. And, I was closely involved in working with our augmented reality technologies group at the HIT Lab within the University of Washington. I saw a convergence where I could use an L-system to generate plants from a genome I designed, which could then be mated in an augmented mating process that would generate unique offspring. The genome I came up with allowed for 13 billion unique expressions, each different depending on environmental factors such as gravity, sunlight, and the arrangement of stems in a vase.
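
For readers who have never met one, an L-system really is just recursive string replacement. The sketch below is a textbook bracketed example, not the festival's genome-driven grammar: F draws a stem segment, + and - turn the drawing heading, and the brackets push and pop a branch. A genome, in this scheme, is a particular choice of rules and angles, and environment enters when the renderer biases the heading for gravity or light.

    RULES = {"F": "F[+F]F[-F]F"}  # one classic branching production

    def grow(axiom, generations):
        """Apply the string replacement rules over and over."""
        s = axiom
        for _ in range(generations):
            s = "".join(RULES.get(ch, ch) for ch in s)
        return s

    print(grow("F", 1))        # F[+F]F[-F]F
    print(len(grow("F", 3)))   # the string explodes quickly, like real branching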

I did not approach the problem as a researcher, although I sure appreciated that point of view as expressed so well by the University of Calgary's Computational Biology group. I also did not really have the training to approach the process as an artist, which became a plus in that I was able to stay totally enthralled in my discussions with hundreds of trained artists who came to try out the virtual plant generation experience. I made a morning news segment for a high-tech news program that supposedly had twenty thousand viewers in the greater Boston metropolitan area. I saw my cherry blossoms and took long walks experiencing a Boston that I found I liked a lot more than the one I had experienced living there for all of 1996 and 1997. It was a perfect mix of real-world time and virtual-world time, which I realized was the goal of augmented reality in the first place! And, probably most of all, I spent seven great intimate days with my Dad, showing him how I work and talking a wild game about everything that had my mind so high at that point in my life. I had worked on the system for two months in my spare time and was sure it would work because I understood the pieces very clearly (something I rarely do, but the subject matter was just so intoxicating). And yet, the night before our first day at the Boston Architectural Center on Newbury Street, I had not yet put the system together with the genome and was still tweaking the genome before implementing its impact on the L-system. There were many great lessons on system dynamics through that experience, and more thoughts about science imitating art (and the other way around), such that I still think about redoing that system nine years later. And, I still have the 10,000+ plants the attendees created, including many that demonstrate the pitfalls of inbreeding (young kids got very frustrated when the plants did not change after mating the same parents over and over with their offspring).

Applying a shared 3-D platform to ocean science content

Just as the funding was drying up for doing skunkworks in virtual and augmented reality platforms at the HIT Lab, I became better and better friends with researchers in the College of Ocean and Fishery Sciences who were looking at the ocean as if it were one huge, living organism. They had huge needs in producing visualizations for the Web, since the Web was becoming a de facto communications platform that researchers had to consider taking advantage of. I was able to get eight hours of work done in half a day on their behalf in order to work on a 3-D platform that extended the Virtual Playground for use in visualizing ocean and nearshore science data. Although they could not see the immediate value of sharing 3-D virtual environments on the Web for shared experiences among researchers, I could see it, and my distinguished colleague Bill Winn actually pursued funding to be able to demonstrate the value of such a platform in the classroom.

We decided to model a twelve-by-five-kilometer watershed that the University of Washington owned and produce it as a virtual world that could be prodded and investigated the way scientists were exploring it in the real world. The Virtual Big Beef Creek project allowed us to beg the time of first-year ecology students so we could watch how they used a 3-D environment to express their learning to each other with shared visual artifacts. A remote sensing professor allowed us to incorporate multiple acquired satellite and aerial flyover data sets into our software so we could support remote sensing theory hypotheses. We made it a bit too much fun by providing each student with a personal eagle to fly around in when they weren't attached to the ground terrain. Some of the males in class begged for a weapon with which to shoot other eagles; so much for mature graduate students. Then again, I am glad they could be so honest. It was clear that young people were coming up through the school systems with the expectation that school time means boring lectures and textbooks while fun time is only for when you get home and play your favorite video game. We were breaking down some cultural norms that weren't so comfortable for all attendees to deal with.

The project let me dive into thinking about the biological, chemical, and physical processes that take place in the ocean. As the researchers often tried to convince me, I was learning a ton through osmosis even though my only formal class time consisted of being a technical Teaching Assistant for a remote sensing class and an Oceanography of Puget Sound class. Only when I finally picked up the first-year oceanography textbook and realized I somehow already knew everything in it did I fully understand how applied technology projects could effectively drive life-long learning. The key was to work for fascinating people who loved to learn, loved to share their hypotheses and data with you, and appreciated your contribution even if they weren't completely sold on its worth. Little did I know how liberating that project would turn out to be for the rest of my career. I began to hold the point of view that my contribution was made valuable by the process through which I approached work and projects. I became confident about bringing all my creativity and my ability to see things anew every day, without repetition or boredom, in order to inspire others to do their best work.

Roll your own graphics engine for ocean science visualizations

As I expanded my interests in developing a platform for a targeted user group, I began to accumulate others who wanted to work with me on such an endeavor. One student was interested in providing a data stream of potential ocean and nearshore data sets worthy of visualization in the virtual world. Another was interested in tying dynamic Web pages seamlessly into the virtual world experience in order to take advantage of both media simultaneously. Yet another student worked with me for two and a half years on improving the Java 3D Virtual Playground base for our work before announcing we could do it far better by writing C and C++ code directly against the OpenGL graphics libraries. His argument was sound: he had just graduated with a highly regarded computer science degree from the University of Washington and had been developing video games within that program in both Java and C++. He was well read in the latest graphics engine algorithms and had demonstrated to me how valuable the new shading algorithms could be for improving our software experience by taking better advantage of the Graphics Processing Unit, so the CPU would not have to work so hard crunching in software what belonged on specialized hardware.

The indouin graphics engine was born when we raised money for its development through discretionary funds controlled by our local Dean and by applying partial funding from each of our visualization tasks to the task of making a better visualization product through a better graphics engine. I trusted my newly hired full-time peer to architect a very good engine and build it reliably using his strong C++ development skills. I partially overcame my bias towards open source development methods by realizing we could get enough done in a short period of time to rival the time it would take to learn an open source platform and participate properly with its developers. And, still, I dreamt of the day when indouin would be so popular and useful that we could use it as the basis for an open source project that others would gladly join.

I watched the project make great strides and produce well-written code that followed very good software engineering principles and design patterns, for which I found myself gaining a deeper appreciation. I watched with a little concern at how seriously we had to discuss each major architectural decision, and at how time-consuming it became to redo a component when a better solution came to light. Participating in such a project was very good for my professional growth, but I could not give up the biases I had built up for rapid prototyping and skunkwork demonstrations, biases that must have bothered the coders who worked so diligently on the core of the engine. Perhaps I didn't engage fully to the point of being transparent about how I felt about working on emerging technologies; I wanted everything to be fun and radical and swashbuckling every day. Instead, I found myself hacking away at prototypes that were deemed less useful because the graphics engine kept shifting beneath my feet. The day came when the lead architect admitted that managing the C++ process was just too cumbersome for the limited resources we had for development. We concluded that Java had matured in the couple of years since we had walked away, and that we could return to Java development to flatten the learning curve and better integrate base modules others were developing in the Java language (and the Java OpenGL-compliant library named JOGL). To their credit, they ported the code back quite quickly. I'll never forget the feeling of living through that sense of growing old in one's beliefs while trying to work with an enthused team of younger people who were deserving and poised to run with the ball.

A GeoWall implementation for experiencing Virtual New Zealand

Six months in New Zealand became possible through the extension of our Seattle lab into the University of Canterbury. While I worked on my own research in a truly international and collaborative setting, I was tasked with working on infrastructure projects while helping exemplify the culture that made our lab in Seattle such a desirable place to work and learn.

I chose to build a GeoWall and create a basic Access Grid node to test out with Australian partners. New Zealand is a marvelous country for "tramping about" (the term for hiking down under), and I was excited to get to know the terrain since I was going to start knowing her more intimately very soon. Kiwi GIS experts gave me lots of great data sets to choose from, and it took me a single day to build a set of tiles for exploring the whole country. The next step was to get a polarized light screen and then calibrate two of the lab's midrange data projectors so they could overlap two images of the New Zealand geography on the specialized screen. I had heard about how innovative New Zealanders had become by being so remote from the rest of the world. I figured I could build a couple of simple shelves off my office wall to house the projectors and could drape the screen over the window at the other end of my office. My host, Mark, provided me with two pieces of polarizing filter glass, which I oriented at 90-degree offsets over each of the projector outputs.
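
As a rough illustration of the geometry involved (this is not the actual GeoWall software, and the numbers are only plausible defaults), the whole trick is to render the scene twice from two camera positions separated by roughly the human interocular distance:

```python
# Passive-stereo sketch: one projector (behind the horizontally polarized
# filter) shows the left-eye image, the other (vertically polarized) shows
# the right-eye image; matching polarized glasses route each image to the
# proper eye, and the brain fuses the pair into true depth.
EYE_SEPARATION = 0.065  # meters, a plausible human interocular distance

def stereo_eyes(center, right_vector):
    """Return (left, right) camera positions offset along the view's right axis."""
    half = EYE_SEPARATION / 2.0
    left = tuple(c - half * r for c, r in zip(center, right_vector))
    right = tuple(c + half * r for c, r in zip(center, right_vector))
    return left, right

# Camera hovering over a terrain tile, with the x-axis pointing right:
left_eye, right_eye = stereo_eyes(center=(0.0, 1.7, 5.0), right_vector=(1.0, 0.0, 0.0))
print(left_eye, right_eye)
```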

I became enthralled at exploring New Zealand in 3-D with the full depth effect provided by the first rendition of a GeoWall kit. I downloaded other data sets made specifically to emphasize the power of true 3-D parallax in exploring world geography. My favorite became the view of world earthquakes at depth, but there were lots of neat visualizations we adapted for the GeoWall, including an object-oriented code library visualization (done with a ball-and-stick kit usually applied to visualizing chemical molecules). The resident video gamers quickly hacked a couple of popular first-person shooters to run on the GeoWall so fast-action games could be more immersive to our frontal cortices. I quickly realized how perfect a new technology toy could be for diving in to meet a research lab's worth of brilliant and motivated programmers and artists. The Access Grid work was nowhere near as exciting, as it involved coordinating remote partners, something I could have just as well done from Seattle. Besides, we had to pay for off-country bandwidth in New Zealand (8.9 cents per megabyte), which did allow me to feel the remoteness of the South Island, since I was not able to run my shared 3-D environments as often as I would have liked for keeping in touch with friends around the globe. But the Access Grid work lingered onwards with one lab researcher who continued to build the node into a top-of-the-line, full-featured node (complete with awesome echo cancellation hardware that became the recommended setup for future nodes around the world). When I got back to Seattle, I appreciated knowing that I had left a grid node behind for keeping in touch with such wonderful people. And, of course, the tramping in the real world down there was much more memorable (until the day I die, for sure) than the virtual experience, no matter how engaging. But hey, I could take the virtual experience back to the GeoWall setup in my office in Seattle to tease my urges to be outdoors in a bearless backcountry from time to time.

An integration platform for earth science simulation modules

The Virtual Scalable Basin project was born when a University Initiative Fund provided funding for projects that bridged academic departments on the UW campus. Creating a Puget Sound Regional SynthesIs Model (a PRISM, so to speak) was an obvious project for bringing the geology, atmospheric sciences, ocean sciences, geography, fisheries, urban planning, civil engineering, and computer science departments together around the glorious goal of simulating the behavior of the physical Puget Sound for prediction and understanding purposes. I was brought on board to represent the ocean sciences visualization community but quickly got mired in organizing the data sets associated with the project. Hundreds of useful data sets were in the hands of a wide swath of researchers across campus, researchers used to competing for funding through the allure of data sets that could be expanded with the promise of future funding. The data sets housed data in a wide variety of Earth projections, data formats, and time steps. Fishery folks looked at the world in six-month stretches, atmospheric scientists in twelve-hour stretches, and geologists in million-year eons. They thrived by building deep career insight at their particular temporal scales; how would they fare speaking to other researchers at temporal scales outside their comfort zone? Well, I suspect we did as well as any big-science campus would, which was marginally so-so.

An interesting issue that arose with my specific programming work was a by-product of our academic system. We push PhD students to work in their own bubble to prove they are capable of generating their own ideas and implementing their vision on their own, with a little advisement from a faculty committee. The hydrology modeling code I had to become intimate with over two years was a hodge-podge left by sixteen different PhD candidates who had used it to demonstrate their moxie by getting it to model some aspect of physical water transport in varying environments (the environment of huge snow pack being a favorite in our part of the world). They each had generated fantastic results that demonstrated the value of a core physical model of water, but they all had slightly different versions of the code, kept in a bare semblance of order by a couple of brave and talented souls in the civil engineering department. Of course, having done the work, they were quite marketable in the open market and were hired out at double their academic salary (but without many of the great benefits of being on a campus, in my opinion). New brave and talented souls would come in and have to learn the hodge-podge anew. I had the interesting opportunity of doing my work during one of those transitions.
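
A toy example of that time-step mismatch (all the numbers below are invented for illustration): rolling an atmospheric scientist's twelve-hour series up into the six-month buckets a fisheries model expects throws away exactly the detail one community lives on and the other ignores:

```python
# Aggregate a twelve-hour time series into half-year means, the kind of
# temporal-scale translation the PRISM data wrangling kept demanding.
from collections import defaultdict
from datetime import datetime, timedelta

start = datetime(2000, 1, 1)
twelve_hourly = [(start + timedelta(hours=12 * i), 5.0 + 0.01 * i)
                 for i in range(730)]  # one invented year of readings

half_year_means = defaultdict(list)
for stamp, value in twelve_hourly:
    bucket = (stamp.year, "H1" if stamp.month <= 6 else "H2")
    half_year_means[bucket].append(value)

for bucket in sorted(half_year_means):
    values = half_year_means[bucket]
    print(bucket, round(sum(values) / len(values), 2))
```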

My approach was to find a good module-wrapping process and wrap the modeling code into modules that could be pieced together in sequential processing steps. This brought me in touch with the fabulous Python programming language community, who taught me how powerful an intuitive scripting language could be for making a complicated computation environment more palatable for new users. I enjoyed working with the Python visualization libraries, turning a hydrology modeling run into a process of flashing colored lights and data transport animations. It was very satisfying to see a complicated computation process in action where before there was only a series of boring textual outputs to a terminal. I chose the OpenMI modeling standard for connecting the wrapped modules together to do meaningful work. That choice got me invited to a workshop in Munich where the greater modeling community presented and discussed wonderful models that had been developed across country borders in Europe. A model of the Danube crossed many borders. Models of Greek rivers bridged the natural channel modelers with the man-made culvert and sewer modelers seamlessly. After all, the OpenMI standard was specifically designed for dealing with complicated political climates, such as the evolving political norms within the European Union.
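
In spirit, the wrapping looked something like the sketch below. This is a hypothetical illustration, not the actual OpenMI interface (which is considerably richer); the point is that every wrapped model exposes the same small life cycle so modules can be chained into sequential processing steps:

```python
# Hypothetical module-wrapping sketch, loosely inspired by OpenMI's
# initialize / per-timestep-update / finish life cycle.
class Component:
    def initialize(self, config): ...
    def update(self, inputs):      # advance the model by one time step
        raise NotImplementedError
    def finish(self): ...

class SnowmeltModel(Component):
    def update(self, inputs):
        # toy physics: melt proportional to temperature above freezing
        temp = inputs["air_temp_c"]
        return {"melt_mm": max(0.0, temp) * 2.5}

class RunoffModel(Component):
    def update(self, inputs):
        return {"streamflow_m3s": inputs["melt_mm"] * 0.01}

# Chain the wrapped modules for one time step.
step = {"air_temp_c": 4.0}
for module in (SnowmeltModel(), RunoffModel()):
    step.update(module.update(step))
print(step)
```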

I smiled because I knew complicated political climates were quite alive and well at major universities with a long history of successful silo-based research within tightly defined academic departments. OpenMI was a pleasure to work with, and I learned oodles about first-principles physical hydrology modeling and atmospheric forcing. But I probably learned even more about people, and about the opinions Europeans held of Americans at the outbreak of the Iraq War. As we introduced ourselves at the Munich workshop, I was at the end of the line and decided not to mention I was an American attendee. I was the only one, and I preferred to listen to various comments about "Americans and their toys", clearly suggesting that we built huge computing clusters for our own use without much interest in the metadata-driven collaborative modeling processes that were absolutely necessary in Europe at the time. What could I say? We didn't have the history of strong borders between states (although we do have our different state-funded GIS communities that create some mighty virtual political borders). And, besides, we had two Iraqi attendees to whom we had to be especially sensitive; they shared the same love of modeling hydrology the other attendees had, and their homeland held the two majestic rivers on which civilization had first evolved in the Fertile Crescent. The bottom line was that I got a better offer for an area of research for my PhD dissertation (the industrial engineering department was not part of our PRISM project) and handed off my responsibilities to a very capable peer with a better sense of the huge computing clusters that were becoming more available to our project team.

A visualization pipeline for daily earth science data supporting a Virtual Puget Sound

I immersed myself in a problem so many information scientists are dealing with these days: how to manage and make sense of all the scientific data being recorded by every possible technology known to man. Ocean scientists are embedding sensors everywhere these days: at two kilometers depth in ocean-bottom basalt, at hundreds of kilometers height aboard Earth-orbiting satellites, and at every place in between. Our humble group of so-called environmental visualization experts got a chance to sip at the proverbial firehose, with the freedom to visualize any data sets we found useful for deeper understanding among a wider audience. We used the IRIS Explorer software package to create visualization pipelines for myriad scientific data sets. Each pipeline output became a potential visual layer for inclusion in a virtual Earth, most prominently for the greater Puget Sound watershed. I continued to learn everything one would want to know about data formats, temporal and geospatial projections, and data resolution and interpolation. I wrote code to encapsulate recurring manipulations of the base data so that we could do the same calculations time and time again. I wrote code to encapsulate visualization techniques, such as color maps and color scales, that could be applied to a wide range of data themes: precipitation, wind, salinity, current, temperature, and snow pack. But, mostly, I got the opportunity to learn what techniques my peers and colleagues were using and encapsulating themselves. The output of our work was very rewarding when included in a graduate course on remote sensing or hydrology, and even more so when we were congratulated for making a difference in a scientific paper's prominence or for helping win a grant with a clear and impressive visualization of potential results.
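
That color-scale encapsulation is the kind of thing that fits in a few lines. Here is a hand-rolled sketch (the stops and ranges are illustrative, not the ones we actually used) of defining a scale once and reusing it across every layer of a given theme:

```python
# Encapsulate a color scale once so every precipitation (or salinity, or
# snow pack) layer is colored consistently across daily visualizations.
def make_color_scale(vmin, vmax, stops):
    """Return a function mapping a value onto an interpolated RGB triple."""
    def scale(value):
        t = (min(max(value, vmin), vmax) - vmin) / (vmax - vmin)
        pos = t * (len(stops) - 1)
        i = min(int(pos), len(stops) - 2)
        frac = pos - i
        return tuple(round(a + (b - a) * frac)
                     for a, b in zip(stops[i], stops[i + 1]))
    return scale

# Blue -> white -> red, reused for every temperature layer we render.
temp_colors = make_color_scale(-10.0, 35.0,
                               [(0, 0, 255), (255, 255, 255), (255, 0, 0)])
print(temp_colors(-10.0), temp_colors(12.5), temp_colors(35.0))
```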

A simulation engine for visualizing community-wide emergency response behavior of first responders

I had been watching all the exciting virtual Earth projects coming online. Contributions from Google, Microsoft, and NASA came at an amazing clip thanks to the power of competition between rival organizations (Google and Microsoft) and the pride of self-proclaimed domain experts (NASA). I was experimenting with laying our visualization layers from the visualization pipeline project on top of the virtual globe. Fun stuff. Then a flood of potential research money hit the market with the aim of better preparing communities for potential crises (both natural and manmade). What better place to do such work than in an epicenter of earthquake activity? Not only was I able to run up the learning curve on the NASA Java World Wind virtual Earth platform, I found out a student and colleague of mine was joining their code community as a paid consultant! With insider information and a beautifully motivational blog from Patrick Murris, I dove into tracking people, vehicles, and medical supplies in Seattle, Detroit, and Christchurch, NZ via animated data layers on top of the NASA Java World Wind virtual globe. My first contribution was an architectural diagram of the possible components for driving the simulation and the communications between role-playing participants. I iterated upon the modules I had experience writing and found great joy in collaborating with talented peers on those modules with which I had less experience. The greatest contributions came from a mathematical modeling expert from Turkey and my long-term lab systems administrator, who apparently still writes code in his sleep. The work has been successful enough to drive my dissertation work satisfactorily: a ripe area for studying distributed cognition and visualizing cognition among people, along with a geospatial and temporal visualization of a community-wide crisis. I got to rub elbows with many people who follow the money when it comes to doing research. I thought I was just following my PhD advisor, since he asked me to get involved. But he had plenty of opportunities to continue his life's work without having to divert tracks.
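
Stripped of the globe rendering, the simulation side reduces to a tick loop that advances every tracked entity and publishes its position for the animated layer to redraw. Everything named below is hypothetical shorthand; the real system sat on top of the NASA Java World Wind platform:

```python
# Minimal simulation-tick sketch: entities (people, vehicles, supplies)
# advance each tick, then their positions are published for a virtual-globe
# layer to redraw as an animated overlay.
import math

class Entity:
    def __init__(self, name, lat, lon, bearing_deg, speed_deg_per_tick):
        self.name, self.lat, self.lon = name, lat, lon
        self.bearing = math.radians(bearing_deg)
        self.speed = speed_deg_per_tick

    def tick(self):
        # crude flat-earth step; fine for a city-scale illustration
        self.lat += self.speed * math.cos(self.bearing)
        self.lon += self.speed * math.sin(self.bearing)

def run(entities, ticks, publish):
    for t in range(ticks):
        for e in entities:
            e.tick()
        publish(t, [(e.name, round(e.lat, 4), round(e.lon, 4)) for e in entities])

ambulance = Entity("medic-1", 47.6062, -122.3321, bearing_deg=90, speed_deg_per_tick=0.001)
run([ambulance], ticks=3, publish=lambda t, rows: print(t, rows))
```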

A virtual surround panoramic walkthrough of my local watershed

I had always told myself that working with electrons was way more sustainable than working with whole atoms (the whole bits-not-atoms thing). Committing myself to bicycling as a transport method seemed green too, and cycling seemed to help me connect better with all the virtual data displays I was producing. I felt like I had gotten to know Puget Sound pretty well, but it was done with the help of many car trips and flyovers. Now I was commuting back and forth between Providence and Seattle four times a year by plane, and my personal carbon load numbers were not looking so impressive. I started reading about the carbon cycle and its most recent contribution to global warming through the leaking of sequestered carbon into our atmosphere. I read authors who really knew their local world with enthusiasm, even though their reach was a mere ten kilometers from home. I began volunteering with the Woonasquatucket River Watershed Council to know my watershed more intimately.

Then I went to a Rhode Island DEM-sponsored GIS workshop. I devoured the data layers, focusing on the near extent of where my home is located (soil, elevation, land use, etc.). When I loaded the watershed layer, I realized my home was two and a half doors down from the ridge of the Woonasquatucket! I actually lived within the drainage basin of the Moshassuck! I felt so ashamed of underserving my local watershed that I immediately contacted the director of the Moshassuck watershed council and told him I wanted to do a full documentation project of the watershed. He was enthusiastic and, better yet, knew the watershed inside and out from years of walking it daily! I built myself a homemade circular camera stand, complete with perpendicular levels. I notched off both 18-degree and 20-degree increments in the circle with which to take surround panoramic photographs and stitch them into virtual panoramas viewable on the Web. I walked all three tributaries of the Moshassuck watershed and took surround pictures at regular increments. I enhanced the existing watershed map to include my locations and provide links to the photographs. Having had the experience of both the intimacy and immediacy of being behind the camera lens and getting to know an underrepresented watershed as incredibly beautiful and life-providing, I felt like a different person. Something had clicked; my world became intentionally smaller while my heart got unintentionally larger. There is a path to personal spiritual fulfillment in such a process, I tell you.
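
The notch arithmetic is simple enough to check in a couple of lines: 20-degree notches mean 18 shots per full circle, while 18-degree notches give 20 shots with extra overlap for the stitcher to work with:

```python
# Camera headings for a full surround panorama at a given notch increment.
def headings(increment_deg):
    return [i * increment_deg for i in range(360 // increment_deg)]

print(len(headings(20)), headings(20))  # 18 shots per circle
print(len(headings(18)), headings(18))  # 20 shots, more stitching overlap
```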

A virtual Earth visualization platform for investigating the path to the sea from anywhere on the planet

Today, my main work project focuses on enticing others to know their local environment within the context of the whole planet, so they can think visually about their lives and how to help conserve the planet that gave and supports their life. Check out the Watershed-to-Ocean Initiative website. I am happy to report that I finally found all the Canadian watershed networks to go with the US-based watershed data sets I've been incorporating over time. Between the two countries, I am homing in on letting you click on over 98% of the land mass and see the path to the sea. Eventually, through a community of collaborators, we'll have data for the whole world, plus the opportunity for each community on Earth to review and maintain their stream networks to keep them current. Mass wasting events like landslides can actually change stream networks. Human water diversion projects change stream networks as well. Anyway, this project falls under the category of life-long interactive projects of significant interest and occasional funding, integrating well with the Watersheds Project introduced below.
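
Conceptually, the click-to-sea lookup is a walk down a directed stream network. Here is a minimal sketch, with made-up reach names standing in for the US and Canadian hydrography records mentioned above:

```python
# Model the stream network as a mapping from each reach to its single
# downstream reach, then follow the chain from wherever the user clicked
# until it reaches the ocean outlet.
DOWNSTREAM = {
    "headwater-creek": "mill-river",
    "mill-river": "main-stem",
    "side-brook": "main-stem",
    "main-stem": "estuary",
    "estuary": None,  # None marks the ocean outlet
}

def path_to_sea(reach):
    path = [reach]
    while DOWNSTREAM.get(path[-1]) is not None:
        path.append(DOWNSTREAM[path[-1]])
    return path

print(" -> ".join(path_to_sea("headwater-creek")))
# headwater-creek -> mill-river -> main-stem -> estuary
```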

Running a non-profit Watersheds Project organization

As of January 1, 2011, the Watersheds Project is its own non-profit organization supporting development through fiscal sponsorship. That's one way of saying I'm able to take tax-deductible contributions from those who believe in the project. The WsP, as I abbreviate it, attempts to build awareness and personal action so all citizens can contribute towards watershed quality in their local geographical region of the world. The WsP starts by engaging citizen support in watershed modeling initiatives, using software generated and maintained by an insightful and competent community of hydrology modelers worldwide. Teaching watershed councils and waterkeeper alliances to integrate long-term watershed modeling into their planning and implementation activities should improve the quality of watershed support.

The unique approach the WsP takes, supporting the watershed community through a content management system managed through social media, should allow the whole community to be more effective in taking responsibility for the quality of their local natural systems while building a responsible economy and ecology based on the valuable assets the natural world provides residents. The shape the support tools and techniques take will depend on the participation and leadership of each watershed participating in the WsP approach. Best practices and consultations will be shared by the WsP as a benevolent dictator that learns, as an organization, from working with motivated watershed stewards. By organizing tools and techniques on behalf of all the world's watersheds, we should have an easier time aggregating models and visualizations of model outputs into a larger world view of the significance of watershed management.

As the director of a non-profit organization that enacts its activities through the WsP, officially a project of The Ocean Foundation, I finally get the opportunity to lead by example, drawing on the best of all the examples I have learned from in the projects above. I have had a hard time laying down stakes in a place I was willing to call home for a significant period of time. Perhaps it is a sign of aging that I enjoy my home and community more than enough to imagine living here for the rest of my life. I fully expect that feeling to flow over to the WsP, so I will be directing this project (and all the various sub-projects that come from such an over-arching enterprise) for a very long time, if not the rest of my life.