Monday, July 11, 2011

Rest Service Implementation


WCF Rest-like Service
Prototype implementation

Vocabulary

Representational State Transfer:  A stateless software architecture style that uses the structure of URIs (uniform resource identifiers) and HTTP verbs to define the resultant service data displays. REST outlines how to define, access, and address data sources without the need for additional messaging layers or session tracking.
Windows Communication Foundation: WCF is a platform that supports a multitude of different methods to expose data services in a service-oriented methodology. It is request agnostic, meaning it is not tied to a single request style such as REST or SOAP.
Http Client: An object used to initiate an HTTP request programmatically, previously known as XHR or XMLHttpRequest. This safe-for-scripting object was used to initiate AJAX calls.

Document

This document details the top-level technological decisions that were made in the prototype WCF Rest-like Service implementation. Because it is based on a prototype implementation, this document is meant to postulate possible guidelines; as such it will remain in draft format in perpetuity, and will be superseded by any and all guidance documents or white papers on the technologies discussed here.

References - Prerequisites

      WCF Rest Service Template 4.0 for Visual Basic
      WCF Rest Starter Kit (http client)
      Visual Studio 2010, .Net Framework 4.0
      IIS 7.0

Http Verbs

It seems appropriate to start any discussion of restful(like) solutions with a conversation about HTTP request methods, often referred to as "verbs."  Most of us are familiar with "GET" and "POST" but have failed up to now to notice the other request methods that the HTTP protocol supports. For now, let it be sufficient (until we begin our discussion on routing masks) to say that restful services respond not only to the URI of a request, but also route based on the action that is described via the request method. WCF commonly responds to GET, POST, PUT and DELETE for CRUD operations; there are more verbs in the HTTP protocol specification, but they are not in common usage in restful services.

Routes

Guideline: Configure your routes in the virtual route table, accessed in code in the Global.asax file in a custom procedure called RegisterRoutes.  This behavior is similar to the routing mechanism employed by MVC.Net web sites.
fig. 1

Private Sub Application_Start(ByVal sender As Object, ByVal e As EventArgs)
  RegisterRoutes()
End Sub

Private Sub RegisterRoutes()
  RouteTable.Routes.Add(New ServiceRoute("RouteName", New WebServiceHostFactory(), GetType(MyApplication.MyClass)))
End Sub


Application Start

Creating a response to a request at a virtual location (there is no physical file to respond to the request) means that a listener will have to be created.  WCF supplies a factory to create this listener, the WebServiceHostFactory class. This factory class should be called at application start and added as a route to the application's RouteTable object.
fig. 2

RouteTable.Routes.Add(New ServiceRoute("RouteName", New WebServiceHostFactory(), GetType(MyApplication.MyClass)))


Routing

Incoming requests are passed from IIS to the WebServiceHostFactory, where they are routed according to the RouteTable rules into the class specified in the Routes.Add type argument. The request is then picked up by the class, and is handled by a method of the class based on the "verb" and the arguments specified by the request.

 

Route Masks (HTTP Method and UriTemplates)

The WCF Rest Template offers common routing, and templates for the most common scenarios. These are mostly self-defining, based on the HTTP method and the existence of an id or name argument. It is worth noting that all arguments passed via a URI will be of type string.
fig. 3

        <WebGet(UriTemplate := "")>
        Public Function GetCollection() As List(Of SampleItem)
            Throw New NotImplementedException()
        End Function

        <WebInvoke(UriTemplate := "", Method := "POST")>
        Public Function Create(ByVal instance As SampleItem) As SampleItem
            Throw New NotImplementedException()
        End Function

        <WebGet(UriTemplate := "{id}")>
        Public Function [Get](ByVal id As String) As SampleItem
            Throw New NotImplementedException()
        End Function

        <WebInvoke(UriTemplate := "{id}", Method := "PUT")>
        Public Function Update(ByVal id As String, ByVal instance As SampleItem) As SampleItem
            Throw New NotImplementedException()
        End Function

        <WebInvoke(UriTemplate := "{id}", Method := "DELETE")>
        Public Sub Delete(ByVal id As String)
            Throw New NotImplementedException()
        End Sub

 
Restful services have some common practices which govern their usage, and these should be adhered to as long as they do not create poorly architected solutions. There is an expectation that reaching the root of a Restful resource (in a URL GET) will return an unfiltered list of objects of that type.  When an "id" or "name" is specified in a GET on a Restful resource, the expectation is that the Restful service will return a single instance of the item in question.

Using the HTTP method "DELETE" means that the requesting caller would like to see that item removed from the underlying persistence store.  This is clear, but the DELETE could be requested using either an "id", as in "please remove the item with this id from the persistence store," or the object could be sent in an http form, meaning "please remove the item in the persistence store that resembles this item." The PUT and POST methods have similar problems, as the user community is not 100% decided on a usage for these items. For the purposes of our discussion, we will assume that POST is equal to a form post in which the requestor desires that we insert the item into the persistence store, and that a PUT is when the requestor desires us to perform an update on an item in the persistence store.
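The conventions just described can be sketched as a tiny in-memory resource. This is only an illustration, written in Python rather than VB for brevity, and every name in it is hypothetical: GET at the root returns the whole collection, GET with an id returns one item, POST inserts, PUT updates, and DELETE removes.

```python
# Hypothetical in-memory "restful" resource illustrating the conventions:
# the verb, plus the presence or absence of an id, selects the operation.
store = {}

def handle(method, item_id=None, body=None):
    if method == "GET":
        # Root GET -> unfiltered collection; GET with an id -> single item.
        return list(store.values()) if item_id is None else store[item_id]
    if method == "POST":                 # insert into the persistence store
        store[body["id"]] = body
        return body
    if method == "PUT":                  # update an existing item
        store[item_id] = body
        return body
    if method == "DELETE":               # remove the item with this id
        return store.pop(item_id)
    raise ValueError("unsupported method: %s" % method)
```

A real service would of course return status codes for missing items and malformed bodies; the sketch only shows how verb plus id carry the caller's intent.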

            A Note of Caution: It is incumbent on me to offer a note of caution on POST and PUT: the Restful service creator should handle PUTs sent as POSTs and POSTs sent as PUTs, or send back an explanation of why the desired result was not achieved.


Complex Routes

Routes that specify multiple filtering items at different levels of the Restful portion of the resource locator are complex routes.

Complex routes need to be specified on the top hierarchical level. For instance, if there is a parent-child relationship and the service should list a collection of children filtered by the parent id, then the handler method will be created on the parent level.

fig. 4

        <WebGet(UriTemplate := "{id}/Child")>
        Public Function GetChildCollection(ByVal id As String) As List(Of SampleItem)
            Throw New NotImplementedException()
        End Function
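To make the route-mask idea concrete, here is a rough sketch (in Python, with invented names, not how WCF implements it internally) of how a UriTemplate such as "{id}/Child" can be matched against an incoming path, capturing each placeholder. Note that, as stated earlier, every captured argument arrives as a string:

```python
import re

def match_template(template, path):
    """Match a UriTemplate-style mask; return captured args or None."""
    # Turn each "{name}" placeholder into a named regex group that
    # captures a single path segment.
    pattern = re.sub(r"\{(\w+)\}", r"(?P<\1>[^/]+)", template.strip("/"))
    m = re.fullmatch(pattern, path.strip("/"))
    return m.groupdict() if m else None
```

A router built on this would try each registered template in turn and dispatch to the matching method, passing the captured strings as arguments.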
       

Introducing the HttpClient

The WCF Rest Starter Kit comes with a new and improved HttpClient utility. This new client, like the old XMLHttpRequest utility, allows a thread to programmatically access web resources without the overhead of a user interface. The new HttpClient is a major update in functionality, since the XHR methodologies of the early 2000s contained no support for HTTP request methods beyond GET and POST, and were also not concerned with serialization. Modern service architectures have significant enhancements to serialization and do not stop at POX (Plain Old XML, simply returning a document of XML data). The WCF service and the HttpClient come together in the serialization of strongly typed data: you may expose your data from WCF as an "exact type" and consume it in your interface as the same "exact type" that you exposed it as. This methodology means that a WSDL data contract is not only unnecessary, but also undesirable, in so far as the WSDL creates a new type that the client consumes.
Additionally, the new HttpClient has built-in support for the "DELETE" and "PUT" HTTP verbs, as well as the more common "GET" and "POST" of XHR.

Usage

The HttpClient is located in the Microsoft.Http.dll, which is included in the WCF Rest Starter Kit.
 fig. 5

        Using client As New HttpClient("http://localhost:8080/Route/")
            Using response As HttpResponseMessage = client.[Get]("Action?id=3")
                response.EnsureStatusIsSuccessful()
                Dim anon = response.Content.ReadAsDataContract(Of List(Of ContractLibrary.Interfaces.IItem))()
            End Using
        End Using


Custom Headers

Adding headers is simple in the HttpClient: access the DefaultHeaders collection of your instantiated client and then call the Add method.
fig. 6

        client.DefaultHeaders.Add("CustomHeader", HeaderValue)


Security/Authentication

The Restful service site is secured through the usual Microsoft web security provider, and needs no configuration beyond what is usually contained in a .Net web site. Accessing the resources in a secure manner usually means that you will be sending a set of credentials (as System.Net.ICredentials) in the HTTP request of the HttpClient utility.
fig. 7

        client.TransportSettings.Credentials = Credentials


Configuration

The options specified in the web.config file of the Rest service are fairly straightforward.  Please note the last WCF section, and in particular the attribute helpEnabled, which automatically creates an operation description page for each route when the route action is "/help."

Authentication:

    <roleManager defaultProvider=".." enabled="true">
      <providers>
        <add name=".." connectionStringName=".." applicationName=".." type="System.Web.Security.SqlRoleProvider, .."/>
      </providers>
    </roleManager>
    <authorization>
      <deny users="?" />
    </authorization>

Routing:

    <modules runAllManagedModulesForAllRequests="true">
      <add name="UrlRoutingModule" type="System.Web.Routing.UrlRoutingModule, System.Web, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />
    </modules>

WCF:

  <system.serviceModel>
    <serviceHostingEnvironment aspNetCompatibilityEnabled="true">
    </serviceHostingEnvironment>
    <standardEndpoints>
      <webHttpEndpoint>
        <standardEndpoint name="" helpEnabled="true" automaticFormatSelectionEnabled="true"/>
      </webHttpEndpoint>
    </standardEndpoints>
  </system.serviceModel>






Friday, April 1, 2011

Microsoft - A Story of an early Compiler Company

Long ago, back in the early nineteen eighties, there was a new technology company that built compilers. This company managed to build a compiler with one trait that was better than all the others: it had a superior implementation of a floating point operator that allowed it to do better precision math than the other off-the-shelf compilers. Since this was good, lots of people used it.

A big company with deep pockets asked the company that was good at making compilers to build a simple text operating system for it, and so the compiler company built DOS for deep pockets. This arrangement worked out so well for both of them, they formed a short lived partnership to make a graphical operating system together.

At about the time that the first graphical operating systems from this partnership came to market (someplace in the very early nineteen nineties), the compiler company wanted graphical applications to run on its new operating system, and so started to port some standard office productivity tools from the text based world to the new graphical operating system world. Office productivity tools had become fairly standard already, so they began making a word processor (the text based standard was WordPerfect) to work in their new glossy graphical computer interface. So they made Microsoft Word, and in doing so created a macro language (based on their “Embedded Basic” language that they created for reuse in various applications) called WordBASIC to allow the users of the word processor to automate tasks (after all, this was a compiler company).

The compiler company then took the WordBASIC macro language foundation it had built and ported it to a “full” compiler called VisualBasic. This new compiler had a new idea in it: support for drag and drop “visual” development. The compiler company partnered with a small start-up company that made a “form generator” and incorporated the visual form generator into the final compiler. This new compiler had menus that were trivially easy to make, differentiating it from the thousands of lines of code usually required to handle menuing, and so it was used to make millions of applications, because its menu system was so easy.

It seems to me that the softies who invented VisualBasic had little intention of its use becoming so predominant; rather, the predominance of the VisualBasic language came from its menuing system being so easy to use.

So around 1993, a set of technologies was beginning to form around Microsoft. Their DOS and IBM partnership had borne fruit in the form of a new graphical operating system called Windows, shipping as Windows 3.1, which had network support. They had partnered with an innovative database company and ported the UNIX Sybase database engine to their 16 bit environment, and in the process were learning all sorts of interesting things about client-server development, which led to the development of protocols that eased the task of connecting to a remote database resource, called respectively ODBC and DAO (which eventually became ADO and ADO.Net, after passing through a transitional period known as RDO).

The compiler company now had a network capable operating system, a full set of office productivity tools (with embedded macro languages), a couple of decent database engines for storing data, a new “open” protocol for accessing data in databases, and a compiler that made it fairly easy to build forms that connected from the client computer to databases and other external resources. This setup, even though no individual tool was better or more innovative than other applications on the market, taken together made the lives of business people easier.

The total was so much greater than the sum of the parts, in fact, that millions of custom applications were made for that platform by widely divergent groups of individuals with widely varying skill sets.

This is one of the great plateaus of information systems, as this configuration has remained largely intact to the present, and looks capable of continuing in roughly this same configuration in the business world for decades.

Tuesday, September 14, 2010

The Innovation of Sharing

Social networking, a disruptive new innovation, is starting to shift people’s online behavior, even to the point that the meaning of "online" has shifted from web based to mobile based.

The new world wide web will be not only based in mobile, but it will be built with applications and tools that put both people and peoples associations at the center of all activities.

Instead of content, you will have posts, and those posts will be "points of sharing"; the interplay between those points of contact, "nodes," will create social network "ties."

Facebook is without doubt not only the market leader as the most used Social networking site, but also is the leader in crafting a platform and set of tools to take advantage of the new opportunities on offer in this new ecosystem.

This has rather large implications for builders of internet tools (previously web sites) and for anyone engaging in commerce on the web. The way people desire to interact with commercial offerings is changing.

Monday, September 13, 2010

Sun Microsystems - Goodbye and thanks for all the UNIX

Sun Microsystems is no more, having been purchased by a company (Oracle) that seems intent on capturing "open" software and hijacking that software for short-term gain. Certain aspects of the deal point to Oracle's acquisition of first InnoDB (the storage engine that MySQL uses) and then MySQL itself, with the acquisition of Sun Microsystems, as a method to hijack the MySQL open source project.

But Sun’s founders were amongst the creators of BSD UNIX (through Bill Joy’s work), and Sun was always the strongest large corporate proponent of "openness" in software, directly creating Java, stewarding MySQL, and having strongly contributed to the vision that produced Linux (through open UNIX work) and PHP (which owes at least some of its inception to Sun’s Java Server Pages). The whole LAMP stack traces at least some of its roots back to Sun (Apache owing the least to Sun, but you get my meaning).

I think we are not taking account of the large position that Sun had as an intellectual proponent of openness in software. And I for one will lament its passing, especially as Oracle starts using Sun Intellectual Property as a patent-troll to quash innovation in the space.

Computer History Museum - The Facebook Effect with Mark Zuckerberg

An area of the web that has, for me, consistently offered a very high quality of discourse about computer engineering is the Computer History Museum. Its YouTube channel may be found here: http://www.youtube.com/user/ComputerHistory/videos.

I enjoyed this interview with Facebook founder Mark Zuckerberg, found here: http://www.youtube.com/watch_popup?v=_TuFkupUn7k&vq=small. I found it refreshing to hear Mark (I can call him Mark, because he is one of my facebook friends) talking about the innovation of sharing. He tried to discuss the potentials of "sharing" over a long time frame: how it would impact the software ecosystem, and how it would create new business opportunities. Of course everyone wanted to talk to him about privacy, which was much less interesting.

Sunday, October 18, 2009

Narratives in Programming

“...The programmer, like the poet, 
works only slightly removed from 
pure thought-stuff. He builds his 
castles in the air, from air, 
creating by exertion of the imagination. 
Few media of creation are [...] so readily 
capable of realizing grand 
conceptual structures...”
       -  Fred Brooks

Saturday, October 17, 2009

OOP Semaphores
Byline: Smoke signals from one developer to another.

I would like to point out the very obvious, which too often gets lost in the shuffle.  The code choices we make when designing components act as a series of “signals” that we send to the next developer working on the library, or to the developer who is applying and implementing the objects we have created.  When we are crafting our code, our choices of class and parameter names, member protection levels, and constructors act as the primary method of disseminating the intended usage of our code.

If an organization’s code base is the most inclusive repository of that organization’s business process knowledge, then our code libraries supply the nouns, verbs, and adjectives of the business process.

Name for global business understanding
Most business processes have a generally accepted internal name.  This is the name that the stakeholder community will use when telling their business story.  Using a different name, as in the statement “to me, blank is this,” when your user community has consolidated on a defined designation, is obfuscation, and to me, smacks of hubris.

Sometimes, you must rename, for elucidation, like when swapping out an acronym or abbreviation for a “Plain English” term that more fully explains the intended usage.  But this means you will have to disseminate the new term and usage throughout the hierarchy you are involved in.

One of the best reasons I can think of to develop documentation artifacts before the creation of a code base, is so that you have a more complete set of thoughts on the business processes, so that the names you choose will be indicative of the actual use of your objects in the wild.

Code the meaning of your Intended Usage into your implementations
So let’s talk about the following code:
  
 Fig.1  
    interface iAnimal
    {
        string Name { get; set; }
        object Vocalize(object SpeechType);
    }



The plain English meaning of this code is that all objects that implement iAnimal will have at least one string property called Name, and a method called Vocalize that takes a parameter of type object and returns an object.  By creating an interface (a code pattern that sets a base contract of what will be contained in an object), the architect of the code signals to other human beings implementing his work that this is the minimal acceptable level of code to implement this functionality.  If you implement this interface, the compiler will only build your object if it contains all the members specified in the interface. Thus the contextual meaning of the code is “you must have all these pieces.”

Fig.2
    public class Animal : iAnimal
    {
        private object _vocalization;
        private string _name;

        public string Name
        {
            get { return _name; }
            set { _name = value; }
        }

        public object Vocalize(object SpeechType)
        {
            return _vocalization;
        }
    }


Figure 2 shows the terms of the code contract being met in full. So we have a cycle of intention and usage playing out into fulfillment: a contract spelled out by the interface, and then contractual obligations met in full by the implementing class.

“What I would like to stress here, is this is not just a series of assembly instructions, this has inherent intended meaning from one human being to another”
So how about when a developer sets up a class as abstract? The plain English meaning is that the object must be inherited to be used; the compiler will not build if you try to call new on the abstract object.  But the contextual meaning that the code’s architect is attempting to send to the implementing developer is that the architect’s intended usage of this code is as a base class that is meant to be inherited.
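The same signal exists in most object-oriented languages. A rough sketch in Python (rather than the C# used above, and with invented names) shows both halves of the message: instantiating the abstract base fails, while an inheriting class works.

```python
from abc import ABC, abstractmethod

class Animal(ABC):
    """Abstract base: the architect's signal that this must be inherited."""
    @abstractmethod
    def vocalize(self):
        raise NotImplementedError

class Dog(Animal):
    """Fulfills the contract by overriding the abstract member."""
    def vocalize(self):
        return "bark"

# Animal() raises TypeError; Dog().vocalize() returns "bark".
```

The runtime error on Animal() is the machine-enforced part; the human-to-human part is the architect saying "use me only as a base class."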

Sunday, July 12, 2009

Planning for Risk

Probabilistic risk assessment and benefit analysis. Probabilistic risk assessment is a methodology to evaluate the risks associated with a technology endeavor.

Risk in a probabilistic risk assessment (PRA) is defined as the frequency of a hazard occurring from an activity or action.

In a probabilistic risk assessment (PRA), risk is characterized by two quantities:
  • Severity: the magnitude of the possible adverse consequence.
  • Probability: the likelihood of occurrence of a hazard.
Probabilistic risk assessment (PRA) answers the following three questions:
  • What can go wrong with the technological entity in question, and what are the initiating events (undesirable starting events) that lead to the hazard (adverse consequences)?
  • What and how severe are the hazards (potential detriments), or the exposure (adverse consequences), that the technological entity may eventually be subjected to as a result of the occurrence of the initiator?
  • How likely to occur are these undesirable consequences, or what are their probabilities or frequencies?
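The two quantities that characterize risk can be combined into a simple ranking. The following is a hypothetical sketch (the ordinal scale and every name are my own, not part of any PRA standard) that scores each hazard as severity times probability and ranks the hazards, worst first:

```python
# Hypothetical ordinal scale; a real PRA would use measured frequencies
# and consequence magnitudes, not a 1-3 scale.
LEVELS = {"Low": 1, "Moderate": 2, "High": 3}

def risk_score(severity, probability):
    """Risk is characterized by the hazard's severity and probability."""
    return LEVELS[severity] * LEVELS[probability]

hazards = [
    ("Project Time Overrun", "Moderate", "High"),
    ("Cost Overrun", "Moderate", "High"),
    ("Crash Project", "High", "High"),
]
# Rank hazards, worst first, to prioritize mitigation effort.
ranked = sorted(hazards, key=lambda h: risk_score(h[1], h[2]), reverse=True)
```

With this scale, "Crash Project" (High, High) outranks the two Moderate/High hazards, matching the intuition behind the risk analysis table below.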

Risk Analysis

  1. Hazard: Project Time Overrun
     Severity: Moderate
     Initiator: Lack of or incorrect project management
     Probability: High
     Exposure: Software not released on time

  2. Hazard: Cost Overrun
     Severity: Moderate
     Initiator: Hazard 1
     Probability: High
     Exposure: Costs skyrocket

  3. Hazard: Crash Project (additional resources allocated; new team members are not properly trained)
     Severity: High
     Initiator: Hazard 1; after the time overrun is identified, new resources are allocated to the project
     Probability: High
     Exposure: Code base becomes highly unstable
Additionally, your full risk analysis should include a cost-benefit analysis, as well as opportunity costs (strategic), with a measurement of the risk of benefit shortfall.

Strategic planning is a set of plans that takes into account other players' responses to events as a plan unfolds. This is different from a process plan, which outlines in linear form the steps to achieve a goal.

Wednesday, March 25, 2009

How can Business Objects in Code help us tell stories?

A business object can be set up to offer our code structure an actor. We should endeavor to keep our actor and its associated UML user story actions separate from our programmatic plumbing: functions like "save to the database" and "serialize to xml."

Encapsulation tells us that we should engage in data hiding practices, so that programmatic plumbing functions can be held in base classes and that functionality can be inherited, keeping the implementation details out of our user stories. Once we get the implementation details out of our utilization code, we can get closer to representing the user story directly in our code.
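As a rough sketch of the idea (in Python, with invented names), the plumbing lives in a base class and is inherited, so the business object's own surface reads like the user story:

```python
import json

class Persistable:
    """Programmatic plumbing: serialization lives in the base class."""
    def serialize_to_json(self):
        return json.dumps(self.__dict__)

class Customer(Persistable):
    """The actor: its public methods read like user-story actions."""
    def __init__(self, name):
        self.name = name
        self.orders = []

    def place_order(self, item):
        # A user-story action, free of any persistence detail.
        self.orders.append(item)
```

Code that uses Customer talks only about customers placing orders; the "serialize to json" detail is inherited and stays out of the story.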

Personally, I think this would go a long way toward keeping our thinking from getting muddled, and also save us from spaghetti code.

Sunday, December 28, 2008

Software Engineer Code of Professional Conduct

ACM Code of Ethics and Professional Conduct

Software Engineer: Phases of Professional Development

Software Engineer Developmental Rubric

Each dimension below progresses through four phases: Jr. Technician / Mid Career Level / Team Leader / Architect.

  • Training: Learns job requirements / Teaches requirements / Teaches technical concepts / Disseminates cultural values
  • Operating Procedures: Told what to do / Standard operating procedure / Best practices / Patterns and practices
  • Standards: Writes code, waits for others to test / Tests functionality, will settle for working functionality / Oversees build, replaces work that is substandard / Creates implementations of concepts
  • Requirements: Only works on requirements that are supplied / Gets requirements from stakeholders / Integrates stakeholder requirements with scope / Integrates requirements and scope with systems architecture and corporate vision; sets technical vision
  • Vision: Disregards vision, too distracting, too many other things to worry about / Disregards vision, believes it is just another stupid HR initiative / Uses vision to generate excitement / Sets vision goals to expand the software community ecosystem and incorporates an altruistic agenda
  • Value System: Does not know / Personal ethics / Team ethics / Community ethics
  • Relationship between Effort and Result: Believes others are "smarter" / Believes excuses, if only everyone else would do something then this would work / Takes responsibility for outgoing work / Integrates contemplation of software concepts into leisure activities; incorporates professional growth into personal learning and private conversations
  • Programming Style: Inline code / Some encapsulation / Class hierarchies / Asynchronous processing, prototyping, interface building
  • Testing: Expects others to test / Ad-hoc testing / Testing against use cases / Automated testing suites


ref: Appendix A: Memetic system for the historical development of developmental research; Dr.D.K.Dirlam

Software Engineering Culture, Shared Cultural Traits

I have given some thought to traits that are common within individuals in the software culture.

Common Cultural Properties:

  • Describes learning as exciting
  • Caution - Pessimism
  • Long Work Hours/Personal Responsibility - Doing anything it takes to get the job done.
  • Strong Personal Sovereignty
  • Anal - Process oriented
  • Oriented towards knowledge acquisition
  • Prone towards empirical experimentation
  • Skeptical - proofs require measurable result

It is easy to see that when asking a knowledge worker "is learning exciting?" all will know the only acceptable answer is yes. Not only would their peers make fun of them (peer pressure?), but answering in the negative during an interview will most likely cost you the job.

The cautious streak comes from the fact that we work in environments filled with possible problems whose implications can get us fired. If someone foolishly destroys the database server, and all staff and all customers are unable to perform work related functions for days, there can be quite severe repercussions. We learn caution from getting burned a couple of times.

On project deadlines we all ramp up towards the end, crashing project timelines, leading us to try to get things finished by the imposed deadline, which represents not the amount of work that needed to be done, but the desired time to finish the work. We get used to very heavy workloads, and very late evenings. Most of us develop mitigation strategies like "work from home" to offset this. Even still, it is common to work until 2am at least 5 times a year, in even the most relaxed environments.

I think a large part of the personal sovereignty comes from the monetization of the software engineer skill set. The negotiating leverage is useful far beyond acquiring money and benefits. After a knowledge worker achieves their financial goals, they can use this leverage to have a say in the development of the environment they work in.

Software engineers are process oriented as a group. The work attracts Type-A people and rewards “anal” behavior.

Knowledge acquisition is rewarded in our culture with more money and career advancement; this positive reinforcement causes the trait to be common amongst members of the knowledge worker community.

When dealing with complex problems that need to be solved, I have heard, "when the going gets tough, the tough get empirical." To ascertain the root cause of a wickedly complex issue (a quandary all software engineers find themselves in), testing hypotheses empirically is a necessary debugging tool.

Our skepticism and desire for measurable truths results from the relationship software engineers have with the business. In bad environments (you know we have all been in them) we work with the subjective emotional experience of “The Business” and try to convert that to testable rubrics and machine instructions.


Friday, December 26, 2008

Software Engineering Culture, a little background

The knowledge worker culture as we know it today began with the acceptance of the UNIX (AT&T) operating system in the 1980’s. The software language C(1), built alongside the UNIX environment, is the root of our modern systems.

In the late 1980’s modern database systems were completed and added support for a common extraction language (Structured Query Language, developed by IBM as SEQUEL), and support for Open Database Connectivity (developed by Microsoft) was added in the very early 1990’s. OS/2, a joint project of IBM and Microsoft which morphed into the Windows NT family, was started in the late 1980’s and released in 1993, the final fundamental step that completed the move to the modern computer era. By 1995, with protocols like TCP/IP, HTTP, and OLE, we had a platform that significantly resembles the modern age of software.

Most of the people that I know in the industry spent 1989 through 1996 developing data retrieval code for windowed GUIs. From 1996 to about 2001 or 2002, the industry scrambled to implement web sites and web applications without much knowledge about their tool sets. So from 2002 to today, we have been cleaning up the mess we made in the late 1990’s.

So when people talk about the pace of change in software, I am taken aback. We have had two decades to get used to the current environment. The last big change in the industry came with Microsoft’s release of the .NET framework in 2001 (we'll talk about SOA later; it came before this, and does have a paradigm-shift grade impact), and that was a shift that took the code base back to the object oriented programming ideals(2) of C. What .Net represented was not so much a rethinking of software, but a nanny system that forced bad programmers to conform to established best practices.

The problem with forming an idea of the changes in software is that the code changes from year to year as better versions (revisions) of software are released, not that the paradigm of development has changed. Add to this the fact that computer systems have immense complexity, and it could appear that everything has changed, when we are only discovering what we did not know last year (but really should have).

So, that almost unanimous statement that "change" is part of IT culture is something that I always wince at when I hear it. The only universally changing thing in IT is knowledge workers abandoning their respective employers for better opportunities.

So what is the Zeitgeist of IT Culture, what are the traits that we share? Unfortunately I will need to weed out those people who write code as only their job, and talk only about Software Engineering Culture.

  • Need for constant process improvement. “If I am not improving things here, I can just go somewhere else that I can make a difference.”
  • Control over their own personal sovereignty. “If you don’t like it, I can just leave.”
  • Aggressive negotiating skills. “I got five hits today on my monster resume.”
  • Need to achieve to their own level of quality. “You don’t own quality, we all do.”
  • Aggressive self-study program. “I will implement that new third party software in three days, I will learn it on the fly.”
  • Consistent desire to implement new ideas. “Check out this recursive function I made!”
  • Desire for a comprehensive solution to problems. “We need to fix the underlying problem, not just patch the patch.”


(1). C is a procedural language with support for functions; it spawned C++, C#, and Java, and is the basis for Microsoft.NET
(2). With some very cool API features, and Visual Studio 2008 has a great code editor, but it still is a direct descendant of C.

Learning Culture in Software Engineering - Preamble

So I have been at this for over two decades now, and have been close to software development for 26 years. During that time, I have noticed a pattern which has been 100% accurate: all software engineers who are very good at their job are dedicated to lifelong learning. They all have a robust self-study program that they control, which allows them to master new ideas.

Often they are mistakenly thought of by others as being smarter. But in truth, they are simply diligently working a self-study program.

A self-study program is important because it trains the software engineer to acquire new information quickly. It helps train the individual in the art of making informed assumptions that speed up knowledge acquisition. It shows the individual the size of the knowledge chunks they can consume, and through repetition trains them to absorb larger information groups. The "self" in self-study is important because it trains the individual to go after information and knowledge on the individual’s own time, and to fit learning into the individual’s life flow.

So if one believes that repetition leads to enhanced skill uptake, then following a self-study program would improve an individual’s overall learning capacity.

Learning and being smart are core cultural traits of the IT industry. This culture of learning in IT is nourished by higher education establishments, large software vendors, and individual engineers. It is a common thread amongst all knowledge workers.

So we will want to look at this cultural trait a bit to understand it.
How do you foster and grow a culture of “life-long” learning?
  • How does “life-long” learning benefit you personally?
  • What value does “life-long” learning bring to organizational structures?
  • How is “life-long” learning taught and passed on to incoming members of the community?
  • Who makes their resources freely available and what benefits does that bring?
  • Where does “life-long education” fit into “life-long learning?”