Architected Futures™

Tools and strategies ... for boiling the ocean

Use Case History

Submitted by joe.vansteen on Wed, 12/28/2016 - 13:27

Introduction

The intent of this post (series?) is to provide some history regarding the background of EATS, and to provide some "Use Case" understanding for the system.

The technical history, or fragments of it, is documented on some of the neighboring pages in the blueprint. What I wanted to convey in this entry is how certain "incarnations" of EATS were implemented in the real world, and how they were used. My "working" career was spent with Bank of America, in the San Francisco Bay Area. I'm a native. I was born here. I like the place. So, that's where I went to work. BofA was a place to "work with computers" in 1970. They actually published a book, which I bought. It listed all the companies that had computers, what kind of machines they were, and how many. I read the book, and I ended up at BofA.

Some ideas that go into EATS are mine, and were formed before I worked for BofA. Other ideas are industry ideas that were developed before and during the time I spent working at the bank. Some ideas came out of bank projects that I worked on. Some ideas came out of the way that I did my work: my approach to problem solving, which I think is allied with systems thinking.

Approach

My approach involves conceiving of the "thing" that I am working on as a system, developing a form of mental model of it, and then working the model system to attempt to find a solution. I would call that a version of systems thinking. And, a lot of people do that. However, I'm also a reasonably competent computer programmer, and I have a blacksmith, mechanic, craftsman gene. So, I don't just build mental models, I develop my own computer code to help me manipulate and work with my mental models. And thus, EATS was born.

If you look at modeling activity as dealing with three components (model, model data, modeling tool), you can see the projects this way:

Use Case History
Tool | Project | Technology | Time Frame
DModel | Use of DModel as tooling support behind the specification of an investment banking and portfolio management component of an international bank. | SAS | 1980s
SysModel | Use of SysModel as tooling support behind the development and testing of a replacement technology for teller transaction processing throughout the State of California. | C | 1990s
EATS | In process. General purpose architecture management tool. | Java | 2000s
Annie | Started. Architecture management AI framework. | G-Suite, Bluemix | 2017+

I hope that by providing a "use case" background, below, some of you will be able to better understand the concept of how EATS can be used. It provides "another viewpoint."

DModel

[Diagram: Model Composition and Weaving]

As noted in the Genesis page, the model and the model concepts for the investment banking project predate the DModel tool. But technical management of the model artifacts was a burden. (The model being a model of "investments" as done at BofA.)

At the time, circa 1984, I was working in the Investment Division. I was responsible for automation support. I had worked with the division planning officer to assemble a strategic business plan, and I was responsible for the automation component. We had compiled a high-level strategic architecture for how we wanted to implement automation to support the division's functional responsibilities. That was accomplished primarily with "word processing" support.

For our next step, we wanted to detail out the architecture to the level of computer system specifications. We brought in a company called the Western Institute of Software Engineering, or WISE, and they brought their methodology, called WISDM, which was largely a version of JAD sessions.

Using their methodology for group sessions and workshops, we developed what were defined to be about 13 models, as lists. These defined the entities and "things" in the environment and in the internal operations which were primary to the business system. Things were categorized into the 13 groups, and templates were used to define each (pseudo-elementary) item. We then developed a series of models defining fundamental relationships between the items on the lists. The lists were of business functions and business data. The models connected the lists using relationships like the ones shown in the individual blue boxes in the diagram above. All defined relationships were "simple sentence" relationships, essentially RDF statements for a limited set of predicates. (Models in the Model Collection diagram which involve more than one relationship predicate are "composite" relationships. These show how fundamental models are aggregated to create composite models.) As context, this predates Zachman's first publication in 1987 in the IBM Systems Journal, but the techniques were similar in terms of high-level modeling.
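To make the "simple sentence" idea concrete, here is a minimal sketch of how such statements and single-predicate models might be represented. This is my own Java illustration of the concept, not the actual DModel encoding, and all of the names are hypothetical.

    import java.util.List;

    // A list item: something from one of the ~13 category lists.
    record Element(String name, String category) {}   // e.g. ("Trade Settlement", "Business Function")

    // A "simple sentence" relationship: subject, predicate, object, like an RDF statement.
    record Statement(Element subject, String predicate, Element object) {}

    // A fundamental model is restricted to a single predicate;
    // composite models aggregate statements from several fundamental models.
    record FundamentalModel(String name, String predicate, List<Statement> statements) {}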

The "Wise guys" as they became known, had a computer modeling tool on their 1984 PC, or early Apple, I don't remember which. And, overnight, after a day's workshops, they would run the model reconciliation software. They would compare the various models, and make sure everything was accounted for and used. And they would cross-check the models. Then they would bring in the "error" sheets the next morning, and the people had to reconcile the models. You had a "good model" when everything was accounted for, and everything reconciled and was used.

A problem arose because the Wise guys could only do one set of models with their software; and we built four model sets to cover the whole division1. Thus, DModel was born, to reconcile the four sets of models. I created DModel in SAS (a very high level statistical modeling package), on our in-house timeshare system. So, I had a lot more horsepower at my command than the Wise guys. Mainly I used SAS as a data parsing and regurgitation machine. It was a "way to get a job done." And, when the Wise guys published, they had very basic reports. I was able to integrate the models with template-organized text specifications (produced on Wang word processors) into my "database." So I was able to produce specification sets that integrated the text with the model information. This helped the reviewers immensely.

When we were finished we had a map of the division's business processes that was rock solid. The division reconciled to the bank. The four major functions in the division reconciled to each other. A logical layer reconciled to a technical specification layer, and the technical specification clearly defined how people and automation should come together to achieve business objectives. Everything was cross checked. And it had been put together by the key business people responsible for doing the work. The department specialists.

Critical success factors in getting there included the organizational power of semantic encoding of the model data at a fundamental level; the ability to manipulate that data in higher level models; the use of a computer to keep the books straight and balanced, as an audit mechanism; and the use of a computer to "crunch the data."

Shortly after this effort was completed, the bank was faced with an intense audit of the Investment Division. In response, I led an effort to further extend the models that had been developed with the Wise guys. The DModel system didn't have to change very much. I added some new semantic relationships, and some new "element" category metadata. And we created templates to collect the new data categories that were required. And I ran the models and worked with folks to reconcile the new material.

  • We defined a set of data called regulations and policies, and we coded up element instances to describe each government regulation by which we were governed. And we defined the bank policies associated with those regulations. And the compliance personnel mapped the regulations and policies to the logical model of the division which we had built.
  • We also went to each department in the division, and asked them to define how their area mapped to the logical model. Within departments, we defined "desks" as units of processing, and each desk was assigned to write its desk procedures.
  • Through this process, we ran the models using DModel, and "in time for inspection" we had an accurate, desk-level compliance model for every desk and every regulation (a sketch of the traversal involved follows this list). Every desk had a desk manual, which tied to the logical model, and which encompassed a complete procedures manual for the desk2. And, in every manual, for every affected procedure, there was a regulatory or policy reference, if appropriate. And the compliance officer had a cross listing, in a large computer output binder, that indexed every regulation, and where and how in the division compliance was effected. And we had a 12-volume encyclopedia of "how things work" in Investments.
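The cross listing amounts to walking the chain from regulation, to policy, to logical-model function, to desk. A rough sketch of that walk, in Java and with entirely hypothetical names (this is not the DModel code), might look like this:

    import java.util.*;

    // Hypothetical sketch of the compliance cross listing: from each regulation,
    // through its associated policies, to the logical-model functions they govern,
    // and on to the desks that perform those functions.
    class ComplianceIndex {
        Map<String, List<String>> policiesByRegulation = new HashMap<>();
        Map<String, List<String>> functionsByPolicy = new HashMap<>();
        Map<String, List<String>> desksByFunction = new HashMap<>();

        // Every desk whose procedures are touched by the given regulation.
        Set<String> desksAffectedBy(String regulation) {
            Set<String> desks = new TreeSet<>();
            for (String policy : policiesByRegulation.getOrDefault(regulation, List.of()))
                for (String function : functionsByPolicy.getOrDefault(policy, List.of()))
                    desks.addAll(desksByFunction.getOrDefault(function, List.of()));
            return desks;
        }
    }

Run over every regulation, output like this becomes the binder the compliance officer carried; run over every desk, the same data becomes the regulatory references in the desk manuals.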

We passed the audit. Maintenance is another story. SAS was a good "up fast" tool, but it isn't a good long-term tool for this type of exercise.

Probably 90% or more of what we defined and did was documentation and modeling of what was, at the time, and in an expanded manner still is, a generic process. Those models simply define how a particular industry works. Only the last bit about departments and desks (and obviously trading strategies) was unique to BofA. That's what is called "best practices" as tailored to a specific organization.

We sell "best practices" to businesses. But, most of it is generic, with a brand label. The EATS tooling does the same thing as DModel, but as a set of generic process models, which can then be tailored for individual application, or a brand label.

When I first started working on EATS in 2005, I bought a copy of Business Plan Pro to help map out my strategy. I was hoping to find a DModel on the CD inside. I was disappointed. They have a very useful piece of software, and I used the product as published. Somewhat. But, what really struck me was their "business planning model" as a generic template with "flavor" variations for all the different industries they "tailor" to. That is a straightforward extension for EATS. Compliance with regulations shouldn't be a problem for businesses of most sizes with the proper tools. It's generic. EATS doesn't really care about scale. It can run DModel exercises on any JVM, assuming access to an AIR Repository for the element being modeled.

SysModel

SysModel was a re-creation of DModel with a different focus, and different software, but based on the same fundamental architectural modeling principles. SysModel was created to handle a quality control challenge during our rollout of new teller workstations throughout the state of California. We had purchased a teller operations software package, and were in the process of writing "customizations" to the package to integrate it with the rest of the retail banking online-delivery computer systems. This was 1989. The IBM PS/2 had recently been announced, and we were actively overhauling our retail banking operations with the new computers.

SysModel was implemented on my PS/2 workstation, using a BTrieve database. The code was written in C, and we used "whatever is your favorite text editor" as input capture tools. 

I was a late hire on the project, and I had been given the assignment of being the bank person responsible for testing of the teller workstation functionality. In more specific terms, a contract employee and I were supposed to write the scripts that other people would execute by hand with the new software, to verify that the system functioned correctly. And, we were supposed to supply a keystroke-exact set of scripts at the completion of testing that could duplicate the exercise using an automated testing system. And coverage had to be complete, as per the user requirements. For a complete teller operations system.

  • My approach was to re-build DModel as what I called SysModel. That was the nucleus for "how to store and manage the content."
  • Then I constructed a metamodel, largely based on generic computer system development frameworks, with some tailored elements to define the specifics of how the system being modeled fit into the framework, and how scripts needed to be accomplished.
  • Scripts were developed as a primary database report. They were produced in two forms: (1) for people, and (2) for the automated tool. The scripts for people were produced in two forms, one showing the keystroke specifications and one without. Early testing used "without" and later testing used "with," because the keystrokes also had to be verified. (A minimal sketch of the two-form report follows this list.)
  • If I recall right, scripts identified references to both code modules and to functional specifications, for ease in reconciliation. And separate reports cross-referenced functional specifications to requirements.
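To make the two-form idea concrete, here is a rough Java sketch of rendering one stored script either with or without keystrokes. The step fields and report layout are my own invention for illustration; the actual SysModel reports were C code over the BTrieve database described above.

    import java.util.List;

    // One scripted test step as it might be stored: the tester instruction, the exact
    // keystrokes, and the code module and functional specification it traces to.
    record TestStep(String instruction, String keystrokes, String codeModule, String funcSpec) {}

    class ScriptReport {
        // Render a script for people, with or without the keystroke specifications.
        static String render(List<TestStep> steps, boolean withKeystrokes) {
            StringBuilder out = new StringBuilder();
            int n = 1;
            for (TestStep step : steps) {
                out.append(n++).append(". ").append(step.instruction());
                if (withKeystrokes) out.append("  [keys: ").append(step.keystrokes()).append(']');
                out.append("  (spec ").append(step.funcSpec())
                   .append(", module ").append(step.codeModule()).append(")\n");
            }
            return out.toString();
        }
    }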

We had a very successful install. 

 

  • 1. We had divided the division into securities trading, sales, operations, and administration (planning, accounting and compliance).
  • 2. If a desk [a position, a job] performed multiple functions in the logical model, the desk manual integrated the requisite information related to those functions into one manual for the desk.
