Friday, April 1, 2011

Microsoft - A Story of an early Compiler Company

Long ago, back in the early nineteen eighties, there was a new technology company that built compilers. This company managed to build a compiler with one trait that was better than all the others: it had a superior floating point implementation that allowed it to do higher-precision math than the other off-the-shelf compilers. Since this was good, lots of people used it.

A big company with deep pockets asked the company that was good at making compilers to build a simple text operating system for it, and so the compiler company built DOS for deep pockets. This arrangement worked out so well for both of them that they formed a short-lived partnership to make a graphical operating system together.

At about the time the first graphical operating systems from this partnership came to market (someplace in the very early nineteen nineties), the compiler company wanted graphical applications to run on its new operating system, and so it started to port standard office productivity tools from the text-based world to the new graphical one. Office productivity tools had become fairly standard already, so they began making a word processor (the text-based standard was WordPerfect) to work in their new glossy graphical computer interface. So they made Microsoft Word, and in doing so created a macro language called WordBASIC (based on the "Embedded Basic" engine they had built for reuse in various applications) to allow users of the word processor to automate tasks (after all, this was a compiler company).

The compiler company then took the WordBASIC macro language foundation it had built and grew it into a "full" compiler called Visual Basic. This new compiler had a new idea in it: support for drag-and-drop "visual" development. The compiler company partnered with a small start-up that made a "form generator" and incorporated the visual form generator into the final product. Menus that usually required thousands of lines of code became trivially easy to make, and because its menu system was so easy, this compiler was used to make millions of applications.

It seems to me that the softies who invented Visual Basic had little intention of its use becoming so predominant; the predominance of the Visual Basic language came from its menuing system being so easy to use.

So around 1993, a set of technologies was beginning to form around Microsoft. Their DOS and IBM partnership had borne fruit in the form of a new graphical operating system called Windows, shipping as Windows 3.1 with network support. They had partnered with an innovative database company and ported the UNIX Sybase database engine to their 16-bit environment, and in the process were learning all sorts of interesting things about client-server development. This led to protocols that eased the task of connecting to a remote database resource, respectively ODBC and DAO (which eventually became ADO and ADO.NET, after passing through a transitional period known as RDO).

The compiler company now had a network-capable operating system, a full set of office productivity tools (with embedded macro languages), a couple of decent database engines for storing data, a new "open" protocol for accessing data in databases, and a compiler that made it fairly easy to build forms connecting the client computer to databases and other external resources. Even though no individual tool was better or more innovative than other applications on the market, taken together this setup made the lives of business people easier.

The total was so much greater than the sum of the parts, in fact, that millions of custom applications were made for that platform by widely divergent groups of individuals with widely varying skill sets.

This is one of the great plateaus of Information Systems: this configuration has remained largely intact to the present, and looks capable of staying in roughly the same shape in the business world for decades.

Tuesday, September 14, 2010

The Innovation of Sharing

Social networking, a disruptive new innovation, is starting to shift people's online behavior, to the point that the meaning of "online" itself has shifted from web-based to mobile-based.

The new world wide web will not only be based in mobile, but it will be built with applications and tools that put both people and people's associations at the center of all activities.

Instead of content, you will have posts, and those posts will be "points of sharing"; the interplay between points of contact, "nodes," will create social network "ties."

Facebook is without doubt not only the market leader as the most-used social networking site, but also the leader in crafting a platform and set of tools to take advantage of the new opportunities on offer in this ecosystem.

This has rather large implications for builders of internet tools (previously web sites) and for anyone engaging in commerce on the web. The way people desire to interact with commercial offerings is changing.

Monday, September 13, 2010

Sun Microsystems - Goodbye and thanks for all the UNIX

Sun Microsystems is no more, having been purchased by a company (Oracle) that seems intent on capturing "open" software and hijacking it for short-term gain. Certain aspects of the deal point to Oracle's acquisition of first InnoDB (the storage engine that MySQL uses) and then MySQL itself, via the acquisition of Sun Microsystems, as a method to hijack the MySQL open source project.

But Sun's founders were among the creators of BSD UNIX (through Bill Joy's work on BSD), and Sun was always the strongest large corporate proponent of "openness" in software: directly creating Java, stewarding MySQL, and strongly contributing to the vision that produced Linux (through open UNIX work) and PHP (which owes at least some of its inception to Sun's JavaServer Pages). The whole LAMP stack traces at least some of its roots back to Sun (Apache owing the least to Sun, but you get my meaning).

I think we are not taking account of the large position Sun held as an intellectual proponent of openness in software. And I for one will lament its passing, especially as Oracle starts using Sun intellectual property as a patent troll to quash innovation in the space.

Computer History Museum - The Facebook Effect with Mark Zuckerberg

An area of the web that, for me, has a consistently high level of discourse about computer engineering is the Computer History Museum. Its YouTube channel may be found here: http://www.youtube.com/user/ComputerHistory/videos.

I enjoyed this interview with Facebook founder Mark Zuckerberg, found here: http://www.youtube.com/watch_popup?v=_TuFkupUn7k&vq=small. I found it refreshing to hear Mark (I can call him Mark, because he is one of my Facebook friends) talking about the innovation of sharing. He tried to discuss the potential of "sharing" over a long time frame: how it would impact the software ecosystem, and how it would create new business opportunities. Of course everyone wanted to talk to him about privacy, which was much less interesting.

Sunday, October 18, 2009

Narratives in Programming

“...The programmer, like the poet, 
works only slightly removed from 
pure thought-stuff. He builds his 
castles in the air, from air, 
creating by exertion of the imagination. 
Few media of creation are [...] so readily 
capable of realizing grand 
conceptual structures...”
       -  Fred Brooks

Saturday, October 17, 2009

OOP Semaphores
Byline: Smoke signals from one developer to another.

I would like to point out the very obvious, which too often gets lost in the shuffle.  The code choices we make when designing components act as a series of "signals" that we send to the next developer working on the library, or to the developer who is applying and implementing the objects we have created.  When we are crafting our code, our choices of class and parameter names, member protection levels, and constructors act as the primary method of disseminating the intended usage of our code.

If an organization's code base is its most inclusive repository of business process knowledge, then our code libraries supply the nouns, verbs, and adjectives of the business process.

Name for global business understanding
Most business processes have a generally accepted internal name.  This is the name the stakeholder community will use when telling their business story.  Using a different name, as in the statement, "to me blank is this," when your user community has consolidated on a defined designation, is obfuscation, and to me smacks of hubris.

Sometimes, you must rename, for elucidation, like when swapping out an acronym or abbreviation for a “Plain English” term that more fully explains the intended usage.  But this means you will have to disseminate the new term and usage throughout the hierarchy you are involved in.

One of the best reasons I can think of to develop documentation artifacts before the creation of a code base is so that you have a more complete set of thoughts on the business processes, and so that the names you choose will be indicative of the actual use of your objects in the wild.

Code the meaning of your Intended Usage into your implementations
So let's talk about the following code:
  
 Fig.1  
    interface iAnimal
    {
        string Name { get; set; }
        object Vocalize(object SpeechType);
    }



The Plain English meaning of this code is that all objects implementing iAnimal will have at least one string property called Name, and a method called Vocalize that takes a parameter of type object and returns an object.  By creating an interface (a code pattern that sets a base contract of what an object will contain), the architect of the code signals to the other human beings implementing his work that this is the minimal acceptable level of code for this functionality.  If you implement this interface, the compiler will only build your object if it contains all the members specified in the interface. Thus the contextual meaning of the code is "you must have all these pieces."

Fig.2
    public class Animal : iAnimal
    {
        private object _vocalization;
        private string _name;

        public string Name
        {
            get { return _name; }
            set { _name = value; }
        }

        // The parameter list must match the interface's Vocalize(object)
        // signature, or the class will not compile against iAnimal.
        public object Vocalize(object SpeechType)
        {
            return _vocalization;
        }
    }


Figure 2 shows the terms of the code contract being met in full. So we have a cycle of intention and usage playing out into fulfillment: a contract spelled out by the interface, then contractual obligations met in full by the implementing class.

"What I would like to stress here is that this is not just a series of assembly instructions; it has inherent intended meaning from one human being to another."
So what about when a developer sets up a class as abstract? The Plain English meaning is that the object must be inherited to be used; the compiler will not build if you try to call new on the abstract object.  But the contextual meaning the code's architect is attempting to send to the implementing developer is that the architect's intended usage of this code is as a base class, meant to be inherited.
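The same contract-enforcement idea can be sketched outside of C#. Here is a minimal illustration (in Python, using its abc module; the class names are invented for the example) of an abstract base class refusing direct instantiation while welcoming inheritance:

```python
from abc import ABC, abstractmethod

class Animal(ABC):
    """Abstract: the architect's signal that this class is meant to be inherited."""
    @abstractmethod
    def vocalize(self, speech_type):
        ...

class Dog(Animal):
    def vocalize(self, speech_type):
        return "bark (" + str(speech_type) + ")"

# Calling the abstract class directly fails at runtime, just as `new`
# on an abstract class fails to compile in C#.
try:
    Animal()
    instantiated = True
except TypeError:
    instantiated = False

print(Dog().vocalize("greeting"))
```

The refusal to instantiate is the "smoke signal": the runtime enforces the architect's intent that only subclasses be used.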

Sunday, July 12, 2009

Planning for Risk

Probabilistic risk assessment, and benefit analysis. Probabilistic risk assessment (PRA) is a methodology to evaluate the risks associated with a technology endeavor.

Risk in a PRA is defined as the frequency of a hazard occurring as a result of an activity or action.

In a PRA, risk is characterized by two quantities:
  • Severity: the magnitude of the possible adverse consequence.
  • Probability: the likelihood of occurrence of the hazard.
A PRA answers the following three questions:
  • What can go wrong with the technological entity in question, and what are the initiating events (undesirable starting events) that lead to the hazard (adverse consequences)?
  • What and how severe are the hazards (potential detriments), or the exposure (adverse consequences) that the technological entity may eventually be subjected to as a result of the occurrence of the initiator?
  • How likely are these undesirable consequences, and what are their probabilities or frequencies?
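As a rough sketch of how the two quantities combine, here is a hypothetical risk-matrix calculation. The ordinal 1-3 scales and the scoring function are my own illustrative assumptions, not part of any PRA standard; the hazard names come from the analysis below.

```python
# Classic risk-matrix sketch: score = severity rank x probability rank.
# The 1-3 scales are illustrative assumptions only.
SEVERITY = {"Low": 1, "Moderate": 2, "High": 3}
PROBABILITY = {"Low": 1, "Moderate": 2, "High": 3}

def risk_score(severity, probability):
    """Combine the two PRA quantities into a single ranking number."""
    return SEVERITY[severity] * PROBABILITY[probability]

hazards = [
    ("Project Time Overrun", "Moderate", "High"),
    ("Cost Overrun", "Moderate", "High"),
    ("Crash Project", "High", "High"),
]
# Rank hazards so the most pressing one is addressed first.
ranked = sorted(hazards, key=lambda h: risk_score(h[1], h[2]), reverse=True)
for name, sev, prob in ranked:
    print(name, risk_score(sev, prob))
```

Even this crude product is enough to order a hazard list for triage; a fuller analysis would replace the ordinal ranks with measured frequencies and costs.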

Risk Analysis
# | Hazard | Severity | Initiator | Probability | Exposure
1 | Project time overrun | Moderate | Lack of, or incorrect, project management | High | Software not released on time
2 | Cost overrun | Moderate | Hazard 1 | High | Costs skyrocket
3 | Crash project: additional resources allocated, but new team members are not properly trained | High | Hazard 1; after the time overrun is identified, new resources are allocated to the project | High | Code base becomes highly unstable

Additionally, your full risk analysis should include a cost-benefit analysis, as well as opportunity costs (strategic), with a measurement of the risk of benefit shortfall.

Strategic planning is a set of plans that takes into account other players' responses to events as a plan unfolds. This is different from a process plan, which outlines in linear form the steps to achieve a goal.

Wednesday, March 25, 2009

How can Business Objects in Code help us tell stories?

A business object can be set up to offer our code structure an actor. We should endeavor to keep our actor and its associated UML User Story actions separate from our programmatic plumbing: functions like "save to the database" and "serialize to XML."

Encapsulation tells us that we should engage in data-hiding practices, so that programmatic plumbing functions can be held in base classes and that functionality can be inherited, keeping the implementation details out of our User Stories. Once we get the implementation details out of our utilization code, we can get closer to representing the User Story directly in our code.
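As a hypothetical sketch of that separation (in Python; the class and method names are invented for illustration), the plumbing lives in a base class while the subclass's public surface reads like the User Story:

```python
class PersistentObject:
    """Programmatic plumbing, inherited and hidden from the business story."""
    def save(self):
        # Stand-in for real "save to the database" plumbing.
        self._saved = True

class Customer(PersistentObject):
    """The actor: its public methods read like User Story actions."""
    def __init__(self, name):
        self.name = name
        self.orders = []

    def places_order(self, item):
        # "The Customer places an order" -- the story, not the plumbing.
        self.orders.append(item)
        self.save()

alice = Customer("Alice")
alice.places_order("widget")
```

The calling code narrates the business action; where and how the data is persisted stays an inherited implementation detail.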

Personally, I think this would go a long way toward keeping our thinking from getting muddled, and also save us from spaghetti code.

Sunday, December 28, 2008

Software Engineer Code of Professional Conduct

ACM Code of Ethics and Professional Conduct

Software Engineer: Phases of Professional Development

Software Engineer Developmental Rubric

Dimension | Jr. Technician | Mid-Career Level | Team Leader | Architect
Training | Learns job requirements | Teaches requirements | Teaches technical concepts | Disseminates cultural values
Operating Procedures | Told what to do | Standard Operating Procedure | Best Practices | Patterns and Practices
Standards | Writes code, waits for others to test | Tests functionality, will settle for working functionality | Oversees build, replaces work that is substandard | Creates implementations of concepts
Requirements | Only works on requirements that are supplied | Gets requirements from stakeholders | Integrates stakeholder requirements with scope | Integrates requirements and scope with systems architecture and corporate vision; sets technical vision
Vision | Disregards vision: too distracting, too many other things to worry about | Disregards vision: believes it is just another stupid HR initiative | Uses vision to generate excitement | Sets vision goals to expand the software community ecosystem and incorporates an altruistic agenda
Value System | Does not know | Personal ethics | Team ethics | Community ethics
Relationship between Effort and Result | Believes others are "smarter" | Believes excuses: if only everyone else would do something, then this would work | Takes responsibility for outgoing work | Integrates contemplation of software concepts into leisure activities; incorporates professional growth into personal learning and private conversations
Programming Style | In-line code | Some encapsulation | Class hierarchies | Asynchronous processing, prototyping, interface building
Testing | Expects others to test | Ad-hoc testing | Testing against use cases | Automated testing suites


ref: Appendix A: Memetic system for the historical development of developmental research; Dr.D.K.Dirlam

Software Engineering Culture, Shared Cultural Traits

I have given some thought to traits that are common within individuals in the software culture.

Common Cultural Properties:

  • Describes learning as exciting
  • Caution - Pessimism
  • Long Work Hours/Personal Responsibility - Doing anything it takes to get the job done.
  • Strong Personal Sovereignty
  • Anal - Process oriented
  • Oriented towards knowledge acquisition
  • Prone towards empirical experimentation
  • Skeptical - proofs require measurable result

It is easy to see that when asking a knowledge worker "is learning exciting?" all will know the only acceptable answer is yes. Not only would their peers make fun of them (peer pressure?), but answering in the negative during an interview will most likely cost you the job.

The cautious streak comes from the fact that we work in environments filled with possible problems whose implications can get us fired. If someone foolishly destroys the database server and all staff and all customers are unable to perform work-related functions for days, there can be quite severe repercussions. We learn caution from getting burned a couple of times.

On project deadlines, we all ramp up towards the end, crashing project timelines and trying to get things finished by the imposed deadline, which represents not the amount of work that needs to be done but the desired time to finish it. We get used to very heavy workloads and very late evenings. Most of us develop mitigation strategies like "work from home" to offset this. Even so, it is common to work until 2am at least five times a year, in even the most relaxed environments.

I think a large part of the personal sovereignty comes from the monetization of the software engineer skill set. The negotiating leverage is useful far beyond acquiring money and benefits. After a knowledge worker achieves their financial goals, they can use this leverage to have a say in the development of the environment they work in.

Software engineers are process oriented as a group. The work attracts Type-A people and rewards “anal” behavior.

Knowledge acquisition is rewarded in our culture with more money and career advancement; this positive reinforcement causes the trait to be common amongst members of the knowledge worker community.

When dealing with complex problems that need to be solved, I have heard, "when the going gets tough, the tough get empirical." To ascertain the root cause of a wickedly complex issue (a quandary all software engineers find themselves in), empirical methods for testing hypotheses are a necessary debugging tool.

Our skepticism and desire for measurable truths result from the relationship software engineers have with the business. In bad environments (you know, we have all been in them) we work with the subjective emotional experience of "The Business" and try to convert that to testable rubrics and machine instructions.


Friday, December 26, 2008

Software Engineering Culture, a little background

The knowledge worker culture as we know it today began with the acceptance of the UNIX (AT&T) operating system in the 1980s. The C language(1), which was built to implement the UNIX environment, is the root of our modern systems.

In the late 1980s, modern database systems were completed and added support for a common extraction language (Structured Query Language, developed at IBM from SEQUEL); support for Open Database Connectivity (developed by Microsoft) was added in the very early 1990s. OS/2, a joint project of IBM and Microsoft that morphed into the Windows NT family, was started in the late 1980s, released in 1993, and was the final fundamental step that completed the move to the modern computer era. By 1995, with protocols like TCP/IP, HTTP, and OLE, we had a platform that significantly resembles the modern age of software.

Most of the people I know in the industry spent 1989 through 1996 developing data retrieval code for windowed GUIs. From 1996 to about 2001 or 2002, the industry scrambled to implement web sites and web applications, without much knowledge about their tool sets. So from 2002 to today, we have been cleaning up the mess we made in the late 1990s.

So when people talk about the pace of change in software, I am taken aback. We have had two decades to get used to the current environment. The last big change in the industry came with Microsoft's release of the .NET framework in 2001 (we'll talk about SOA later; it came before this, and does have a paradigm-shift-grade impact), and that was a shift that took the code base back to the Object Oriented Programming ideals(2) of C++. What .NET represented was not so much a rethinking of software as a nanny system that forced bad programmers to conform to established best practices.

The problem with getting an idea of the changes in software is that the code changes from year to year, as better versions (revisions) of software are released, not that the paradigm of development has changed. Add to this the fact that computer systems have immense complexity, and it can appear that everything has changed, when we are only discovering what we did not know last year (but really should have).

So that almost unanimous statement that "change" is part of IT culture is something I always wince at when I hear it. The only universally changing thing in IT is knowledge workers abandoning their respective employers for better opportunities.

So what is the Zeitgeist of IT Culture, what are the traits that we share? Unfortunately I will need to weed out those people who write code as only their job, and talk only about Software Engineering Culture.

  • Need for constant process improvement. “If I am not improving things here, I can just go somewhere else that I can make a difference.”
  • Control over their own personal sovereignty. “If you don’t like it, I can just leave.”
  • Aggressive negotiating skills. “I got five hits today on my monster resume.”
  • Need to achieve to their own level of quality. “You don’t own quality, we all do.”
  • Aggressive self-study program. “I will implement that new third party software in three days, I will learn it on the fly.”
  • Consistent desire to implement new ideas. “Check out this recursive function I made!”
  • Desire for a comprehensive solution to problems. “We need to fix the underlying problem, not just patch the patch.”


(1). C was a procedural language with support for functions; it spawned C++ (and through it Java and C#) and is the basis for Microsoft .NET.
(2). With some very cool API features, and Visual Studio 2008 has a great code editor, but it still is a direct descendant of C++.

Learning Culture in Software Engineering - Preamble

So I have been at this for over two decades now, and have been close to software development for 26 years. During that time, I have noticed a pattern which has been 100% accurate: all software engineers who are very good at their job are dedicated to lifelong learning. They all have a robust self-study program that they control, which allows them to master new ideas.

Often they are mistakenly thought of by others as being smarter. But in truth, they are simply diligently working a self-study program.

A self-study program is important because it trains the software engineer to acquire new information quickly. It helps train the individual in the art of making informed assumptions that speed up knowledge acquisition. It shows the individual the size of the knowledge chunks they can consume, and through repetition trains them to absorb larger information groups. The "self" in self-study is important because it trains the individual to go after information and knowledge on the individual's own time, corresponding to the individual's life flow.

So if one believes that repetition leads to enhanced skill uptake, then following a self-study program would improve an individual's overall learning capacity.

Learning and being smart are core cultural traits of the IT industry. This culture of learning in IT is nourished by higher education establishments, large software vendors, and individual engineers. It is a common thread amongst all knowledge workers.

So we will want to look at this cultural trait a bit to understand it.
How do you foster and grow a culture of “life-long” learning?
  • How does “life-long” learning benefit you personally?
  • What value does “life-long” learning bring to organizational structures?
  • How is “life-long” learning taught and passed on to incoming members of the community?
  • Who makes their resources freely available and what benefits does that bring?
  • Where does “life-long education” fit into “life-long learning?”

Wednesday, June 18, 2008

Memes as Class Objects

I am interested in the idea of the meme because it appears to behave in a manner very similar to a class. A meme is a container that may hold any number of memes inside it. A meme, as a knowledge unit, suggests a discrete action that it accounts for. A meme may have properties.

A Meme, as a transferable knowledge unit, would consist of at least one noun and one verb. This would correlate to properties and methods in OOP.

The meme would appear to transfer well as a method of capturing work-flow into an object-oriented class structure.
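A minimal sketch of the analogy (in Python; the names are invented for illustration): a meme as a class whose noun becomes a property, whose verb becomes a method, and which may contain any number of memes inside it:

```python
class Meme:
    """A knowledge unit: at least one noun (property) and one verb (method)."""
    def __init__(self, noun, verb, parts=None):
        self.noun = noun          # the property: what the meme concerns
        self.verb = verb          # the discrete action the meme accounts for
        self.parts = parts or []  # memes contained inside this meme

    def act(self):
        # The verb applied to the noun: the transferable action.
        return self.noun + " " + self.verb

# A meme containing another meme, like nested knowledge units.
handshake = Meme("hand", "shake", parts=[Meme("eye", "contact")])
print(handshake.act())
print(handshake.parts[0].act())
```
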

Sunday, June 15, 2008

Memes = Knowledge Units

In business, knowledge units are routinely passed from person to person as people shift and change positions. Many people in an organization have knowledge about the functions a person performs in a position, from the specific to the general. Most often, position-specific information is passed from position occupant to position occupant. During the 1970s, the study of the actions of a group of business people was called "Systems Analysis." Today, much of an IT group's understanding of business knowledge is gained through "Use Cases" or "User Stories."

User Stories have more to do with the desired solution than with what is currently happening, so the shape of "Systems Analysis," which was the documentation of current procedures, has changed somewhat over time. Now we collaborate more on what we want to happen than on what happens.

A partial understanding of what is currently happening could be gained using the sociological tools of memetics. Memetics is the study of transmittable pieces of knowledge replicated primarily through imitation. Lots of work flow analysis may benefit from capturing the information that is transmitted from position holder to position holder as natural turnover happens in the enterprise.

A meme is a unit of knowledge that can be transferred between people, usually through imitation. Transfer of knowledge units happens through replication and propagation. Knowledge transfer through an organization has a transformation vector. Knowledge vectors have a tendency to cluster and happen together, or "herd"; this is memetic association.

A knowledge unit can morph between propagations, not unlike a game of telephone; this is reflected in memetic drift, or the meme's copying-fidelity. A meme is thought to have memetic inertia if its characteristics are manifested in the same manner regardless of who receives or transmits the meme.

A knowledge unit may not always be healthy and helpful; this is reflected in the meme's fitness. A knowledge unit that does not allow another specific meme to exist is an allomeme, a mutually exclusive cultural trait.

A meme's rate of replication, and therefore its spread, is the meme's fecundity. The longer any instance of the replicating pattern survives, the more copies can be made of it; this is the meme's longevity.

A cluster of memes is known as a memeplex. A collection of memeplexes is known as a deme.

As a software developer, a portion of my job is to capture the processes in an organization. There are most likely great tools out there to do this; I just don't know about them. So I spend time wondering about how knowledge is transferred around the office, so that I can take my butterfly net, capture it for study, and turn it into "machine instructions."

User Stories are great, but sometimes they don't get deep enough into the processes of knowledge. So I use "User Stories," but think about the propagation of knowledge in the enterprise.


Class-Responsibility-Collaboration hierarchies in databases

Database structure is the set of rules for how an organization's data interacts. This may seem obvious, but if you look at database designs and development, you see that the obvious is often ignored in context. The database structure is the metadata, the data that explains data.

An organization, from where I sit in the back office, "virtually" consists of a network skeleton that reaches across to nodes and/or satellites. A backbone of heavy iron churns the data and runs the processes, launched by requests from workstations.

Usually an organization will have one or many databases that act as the repository of information in an organization. The purpose of this data is to monetize the information or processes for profit.

The way to think about the database is not to view it as a dog-pile of stuff, but to think of knowledge units. Many knowledge units are contained in the database and linked through processes and/or relationships.

Knowledge units are first understood through the analysis of existing (or new) business practices. This is often done through "Use Case" scenarios. These knowledge units are transformed (part of the problem?) into "machine instructions†" by a programmer.

Knowledge unit storage in the database is the combination of data and the rules applied to that data, in conjunction with the relationships the data has with other data (a variety of rule).

Yet what so often happens in the production of a database design is that too much of the design is set up because "I need to store this somewhere," and a table of codes is created. The structure reflects the needs of the designer at design time, but does not take into account that what needs to be stored are things from the real world. These knowledge units are digital depictions of real-world objects or processes. The real world intrudes on the digital through Class-Responsibility-Collaboration hierarchies, but too often the immediate needs of a program result in structures being created on the fly that are nonsense.

So what I am saying is that when we store data in a database, we are storing the digital representation of a real thing, a classification, or a series of actions. To toss this willy-nilly into a hierarchical structure risks turning the whole structure into a game of "Silly Buggers."




†Usage for me is any set of instructions a computer can use to render results, i.e. table relationships, SQL statements for data extraction, script or compiled code, etc.

Code as "Story-Telling" Introduction

Computer code is a language of expression. The core purposes of a programming language are to articulate rules and actions. This expression of the rules is done by people, whose purpose is to send instruction sets to the computer in such a way as to get the computer to execute a series of tasks that achieve a desired and most likely testable result.

But the life cycle of programs goes beyond the first publication. Some computer code exists in a modified state for over 30 years. This life expectancy impacts how the written instructions should be targeted. It is not enough for a program to be designed so that it does exactly what was desired; another aspect of a program is its human maintainability over its full life cycle.

When developing code, the business processes are laid out in documents, and these documents amount to the intention of what a program should do. The industry is getting better at producing these documents and putting together a scope. The problem with the scope is that each change request applied against it amounts to an erratum to the original. Over the life of development of an item, the original intention of the software is greatly modified, and you get errata of errata of errata, and so forth. The issue then is that not even the documentation of the business process adequately defines what the true, in-production business process actually is.

I sometimes view software through the conceit that the software is the real story of a business, and that reading a business's software is an exercise in exploring an enterprise's set of practices. This storytelling view of software, whether I am reading someone else's programs or writing software myself, keeps me in a frame of mind, while developing, of knowing that there will most likely be many other people who work on a piece of software before its useful life expires.

This view, as it has reached fullness, has left me slightly discouraged with the state of the enterprise software I have worked with. On the web, the software instruction set is usually just a dog-pile: in-line code directives (processed line by line, much like an old BASIC program) piled into pages that are not even organized well into directories, without even the use of basic programming plumbing like methods. In every organization I have worked at, the code in these repositories could best be summed up by naming it "Silly Buggers."
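To make the contrast concrete, here is a deliberately small, hypothetical illustration (the business rule and all names are mine): the same page logic written as an in-line dog-pile of directives, and then pulled into a named method with basic programming plumbing.

```python
# In-line style: steps run top to bottom, and the business rule is buried
# in the sequence. The next maintainer must reconstruct the intent.
orders = [{"total": 120.0, "state": "CA"}, {"total": 80.0, "state": "NV"}]
taxed = []
for o in orders:
    t = o["total"]
    if o["state"] == "CA":
        t = t * 1.0725
    taxed.append(round(t, 2))

# Organized style: the rule gets a name, a home, and a docstring, so the
# code tells the story of the business instead of hiding it.
CA_TAX_MULTIPLIER = 1.0725

def total_with_tax(order):
    """Apply California sales tax when the order ships to CA."""
    total = order["total"]
    if order["state"] == "CA":
        total *= CA_TAX_MULTIPLIER
    return round(total, 2)

structured = [total_with_tax(o) for o in orders]
print(taxed == structured)  # True: same result, very different readability
```

Both versions produce identical output; the difference is entirely in whether a future reader can find and change the rule without re-deriving it from a pile of in-line steps.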

A large part of the software-writing community strives only to get the exact desired result from software. If the desired result is achieved, then the software is deemed successful. This is best summed up by programmers who, asked about their software, reply in its defense with "It works," which somehow absolves the software from any necessity of organization, craftsmanship, or maintainability.

If software is the one true repository of the business processes of an enterprise, I wonder how a business can afford to have its processes stored in a manner that other human beings are not able to readily understand or replicate. Additionally, if the usage of code goes beyond just getting the correct result, and code instead has status as both a rules repository and a dynamic living document, then perhaps the way the code itself is written should be looked at.

Wednesday, June 11, 2008

My History

My name is Gareth Dirlam. I was raised in San Diego County, a teenager (relatively) next to the hot spot of technology. My father was a computer salesman, and I learned about business system programming and database systems at the age of twelve. I worked in financial departments starting in 1989, when I was 19, doing budgeting forecasts. Until now, I have worked mostly in the insurance and medical industries, apart from maybe a year or two.

As a programmer (a name that, in my thoughts, suits me best), I moved from script hacking to object-oriented design in 1999 with the assistance and mentoring of some excellent engineers. I now write almost exclusively in C# and VB.net when I am on the server side.

As an engineer and systems architect who cares about tiers, I have done a lot of work with databases, and I think the database is a fine place to be. I work in database systems as a database designer, and believe strongly in them. I work primarily in SQL Server.

I am currently employed as an IT director in the real estate industry.

I am mostly a web programmer, preferring that platform for its ease of deployment and for my comfort with a tool set I have worked with for a long time. I know HTML/XML, CSS, XSLT, and JavaScript, and use them to improve the overall experience of the web applications I develop.

New Blog

I felt that I needed someplace to start putting some ideas to paper (so to speak) and so I have started a blog.

I will post here about my thoughts on programming, coordination of programming efforts, industry trends (past, present, and future), and any aspects of the collaboration game that come to mind.