Inventory Control Software - 10 Second Rule - Part 3

If you have been following this blog and wondering how DataWorks plans to compress time and jump over some of the low hurdles lined up along the space-time continuum, you can leap over "The Boring Bits" and race down to the plot-spoiling "10 Second Solution" finish line below.

The Boring Bits

When we started designing NeXT® back in 1998, our primary thought was: get the inventory data model correct. The first line of the ARMS™ system had been written way back in 1984 - two years before DataWorks was even founded (more on that later). It was a very good system for its time, but it had many design limitations that kept it from growing into an enterprise system capable of managing all the purchasing and inventory needs of the hospitality and entertainment industry.

NeXT was a total rewrite. We used all of our collective experience to scope out a system that would tackle all the problems with the ARMS system and take DataWorks into the 21st century.

ARMS had a whopping 67 data tables that stored all the information in our inventory control system.

At last count, NeXT has over 520 tables -- and it is still growing.

If you put the two systems side by side, entered the same purchase order, then received the same partial packing slip with 100 products, ARMS would actually beat NeXT in a "save-the-receiver" race. Two reasons:

  1. We are tracking and storing MORE information with NeXT.
  2. ARMS was written with the 8- and 16-bit PC processors of the early 1980s in mind.

We knew that we had MORE data, but we figured Moore's Law would take care of the details. We really banked on Gordon E. Moore's prediction that the number of transistors on a CPU would double every two years. Those additional transistors would translate into faster processing time, and whatever programs we wrote would be quickly executed by Intel's chip du jour.

Moore's Law has held true (and probably will until the transistors reach the molecular level - 2015 or 2020, depending on your favorite forecaster/futurist), but the speed of computers has not doubled every two years. Computer speed has plateaued. Back in the late '80s we purchased a new computer every year. Jumping from a 286 at 10 MHz to a 386 at 16 MHz, or getting a PC with a red turbo button on it, meant you were going to get some immediate productivity increases. Now, our R&D group only gets a new PC once every three years. The computer I use today to write code is maybe 20% faster than the workstation I used three years ago.

Maybe someone can find a graph that compares transistor density to processing speed and add it as a comment.

The other technologies we bet our ERP ambitions on were hard drive space and hard drive access time. Back in 1986 a 20 megabyte hard drive with a 65 millisecond access time was standard gear. We had to be very stingy with the data we kept - in fact, the first versions of ARMS did not keep historical sales, because there simply was not enough hard disk space, or time in the day, to save it or report on it.

Think about this metric. Hard drive storage space has kept pace with transistors (I think there is a Law for that too, but I don't know what it's called. Comment anyone?), but access time - the all-important go-find-the-data-and-present-it-to-the-end-user benchmark - has been sitting on top of a plateau for a long time. I remember the huge jump in 1988. We went from 65ms to 24ms access time and ARMS tripled in speed overnight. I think we actually jumped from our desks and cheered. Today's hard drives (circa 2010) typically have 9 millisecond seek times. Now, DataWorks does own some really fast gear that is used in our ASP server farm - I understand those drives have 2 millisecond seek times. So what is that in terms of speed improvement? Going from 24ms to 9ms is great - but 24 divided by 9 works out to only about 2.7 times faster than 1988. For a 22 year time frame, that is really not much of an increase.

What does that mean? In 2002, when we shipped NeXT, we got the system we designed, but we shipped a slower system than we expected. We wanted a fast-turning, quick-accelerating, dog-fighting F-16 Falcon, but what we delivered was a big, heavy F-105 Thud. (This is a shout-out to Col. John Boyd - more on him later.)

One trend in computer design that might have helped us speed up our application was the introduction of dual- and quad-core computers. That means 2, 4, or more CPUs are huddled inside one computer. If the first CPU is busy, the second, third, or fourth CPU could theoretically pick up the slack and crunch the data we are saving.

All that is well and good, but that is not how inventory transactions work. As I described way back in Part 1, inventory transactions are like a factory assembly line: you step through each process and do each step in sequence until you get the final product. In our factory, that final product is recording the event, updating the merchandise value, satisfying the accountants, and generating analysis data.

What this means is that we cannot compute On Hand values independently from On Order values. It would really speed things up if you could assign each CPU within the computer a separate thread of logic: You - CPU #1 - take care of the cost calculations; hey buddy - that's right, Mr. CPU #2 - you work out the new On Order quantity for the purchase order; you over there sulking in the background - Corporal CPU #3 - front and center, march over to the parade ground and compute the On Hand values; and Charlie CPU #4, step outside to the internet and grab the current exchange rate for the EURO - hurry up, because CPU #1 needs it to calculate the new landed cost. And - oh by the way - while all of you are concurrently working on the same inventory records, play nice and don't step on each other's data.
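
To make that dependency chain concrete, here is a minimal sketch - in Python, with made-up names and numbers, not a peek at NeXT's actual code - of why the steps cannot be farmed out to separate CPUs: each calculation consumes the result of the one before it.

    # Hypothetical receiving pipeline. Every step feeds the next one,
    # so three of our four imaginary CPUs would just stand around waiting.

    def fetch_exchange_rate(currency: str) -> float:
        # Stand-in for Charlie CPU #4's trip to the internet.
        return {"EUR": 1.31, "USD": 1.0}[currency]

    def save_receiver(qty: int, unit_cost: float, currency: str,
                      on_order: int, on_hand: int) -> dict:
        rate = fetch_exchange_rate(currency)   # step 1: get the EURO rate
        landed_cost = qty * unit_cost * rate   # step 2: needs the rate
        new_on_order = on_order - qty          # step 3: needs the received qty
        new_on_hand = on_hand + qty            # step 4: ditto
        # step 5: the ledger entry needs everything computed above
        return {"on_order": new_on_order, "on_hand": new_on_hand,
                "merchandise_value": round(landed_cost, 2)}

    print(save_receiver(qty=100, unit_cost=4.50, currency="EUR",
                        on_order=250, on_hand=30))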

Threaded computing has very limited use in the business world that I live in. If DataWorks were in the weather forecasting business, we could use all that CPU muscle to concurrently crunch and test various models to predict the weekend beach forecast, the pollen index, or how many named hurricanes are going to make North American landfall.

DataWorks is first and foremost in the inventory operations business. We do some serious forecasting for purchase orders, but our current needs do not require concurrent threads to predict the vendor-lead-time usage for an item; we do that by jumping over one hurdle at a time.

So to wrap up these boring technical bits:

  1. DataWorks is stuck with this kind of programming: do step 1, do step 2, do step 3, ... until we finish the Nth and final step.
  2. The whole world is stuck on top of the current computer-silicon-hard-drive hardware summit. The landscape ahead is a gently rising plateau - there are no big productivity jumps on the horizon.

10 Second Solution

First off, let's state the obvious: if you are running NeXT on your desktop in a LAN or TS environment, just start running another copy of NeXT on your desktop. If you have one copy crunching a big receiver, just start another session of NeXT and go work on that second task. We don't license our software by the seat, so have at it. Now, if you are using NeXT hosted on the DataWorks ASP Farm, you could also start another session if you have subscribed for the additional logon (hey, we've got to make money too).

But in Version 7 we will be adding something to NeXT that will make running multiple copies of the software seem like an 8-Track Tape in a 1975 Pontiac Catalina. In Version 7, scheduled for release in the 4th quarter of 2010, we will be shuttling transactional processing off to a silent background program so that your current task can be released from the drudgery of saving data. You will be free to start another task -- rather than watch a thermometer bar inch its way across the screen.
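
The hand-off could look something like the sketch below - Python and SQLite standing in for whatever Version 7 actually uses, and a hypothetical service_jobs table - where the save routine records the pending transaction in a queue and hands control straight back to the user.

    import json
    import sqlite3

    # Hypothetical hand-off: instead of posting the receiver itself, the
    # client drops the job in a queue and returns control immediately.

    def enqueue_transaction(db: sqlite3.Connection, kind: str, payload: dict) -> None:
        db.execute(
            "INSERT INTO service_jobs (kind, payload, status) VALUES (?, ?, 'pending')",
            (kind, json.dumps(payload)),
        )
        db.commit()
        # Done - no thermometer bar. A background service picks it up from here.

    db = sqlite3.connect("queue.db")
    db.execute("CREATE TABLE IF NOT EXISTS service_jobs "
               "(id INTEGER PRIMARY KEY, kind TEXT, payload TEXT, status TEXT)")
    enqueue_transaction(db, "receiver", {"po": "PO-1001", "lines": 100})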

The plan is for this work to be picked up by the NeXT Service. Currently, the Service really has a pretty easy job. It hangs out by the CPU's cooling fan, just looking for something to do. It really only stretches its muscles during the night shift, when it takes care of store polling, processing sales, updating the inventory stock ledger, calculating turns, adjusting vendor lead times, and performing inventory analysis.

Since the NeXT Service has so little to do during the day, it will be told that an inventory transaction has just occurred and asked - if it is not too terribly busy - whether it would mind picking up the job and taking care of it in the digital background.
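
On the Service's side of the conversation, the day job could be as simple as the polling loop sketched below (same hypothetical queue as above, not the actual Version 7 internals): check for a pending job, claim it, run the sequential pipeline, and mark it done.

    import json
    import sqlite3
    import time

    def process_job(kind: str, payload: dict) -> None:
        # Stand-in for the real work: the step-1-through-step-N pipeline.
        print(f"processing {kind}: {payload}")

    def run_service(db: sqlite3.Connection, poll_seconds: float = 5.0) -> None:
        # Hypothetical NeXT Service loop: hang out by the cooling fan until
        # a pending job shows up, then crunch it in the background.
        while True:
            row = db.execute(
                "SELECT id, kind, payload FROM service_jobs "
                "WHERE status = 'pending' ORDER BY id LIMIT 1"
            ).fetchone()
            if row is None:
                time.sleep(poll_seconds)   # nothing to do; check again shortly
                continue
            job_id, kind, payload = row
            db.execute("UPDATE service_jobs SET status = 'running' WHERE id = ?", (job_id,))
            db.commit()
            process_job(kind, json.loads(payload))
            db.execute("UPDATE service_jobs SET status = 'done' WHERE id = ?", (job_id,))
            db.commit()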

Here are the current tasks that NeXT Version 6 contends with:

  • Run the Sales Imports and Inventory Exports (which can be scheduled as often as every 3 minutes)
  • Send email about polling problems to DataWorks for review
  • Refresh the Inventory Daily Stock Ledger with previous day's inventory transactions
  • Roll up Stock Ledger data for Period Analysis (13 week, 13 month, Q5, Q4, etc.) for Time-Series forecasting methods
  • Compute Actual Vendor Lead Time and Time to Floor by Vendor, Vendor-Subclass and Vendor-Product
  • Compute Stock Turns for Sub-Class and Open to Buy Merchandise Classifications
  • Generate Suggested Orders
  • Roll up Stock Ledger Data for Open to Buy and Budgeting Modules

In Version 7 of NeXT we will be adding these additional tasks to the NeXT Service:

  • Query and publish inventory reports to subscribed end users
  • Generate Suggested Transfers
  • Handle "large" inventory control transactions for the end user

In Version 8 of NeXT our plans are to open up the Service events to the end user community and allow expert users to configure and define their own procedures to be run by the NeXT Service.

That is when "things" should really get interesting.