Inventory Control Operations and Transactions

While writing the prior posts about the 10 Second Rule in inventory control, I started thinking about our end-users and the forms they use to operate our software. Imagine that we could gather every inventory control system ever written by DataWorks (and while we are at it, let's toss in every AR, AP, GL, Payroll or ERP application ever programmed by any software company) into a huge vat of creamy digital goodness. Then, using a yet-to-be-invented silicon spatula, we would pry the user interfaces from these systems into a giant pile of one-dimensional pelts.

Next we would rake these skinned bits into the intake scoop of a yet-to-be-invented silicon-shaking machine. After clicking the OK button, we would adjust the lumbar support of our Herman Miller Aeron chair (mine's black), and watch as the mythical machine shakes and sifts the leafy bits through various mesh screens. Eventually all the input would be sorted into five unequal mounds of output. In order of their height and weight, the piles would be grouped like this (you may not like the names of the piles, but this is my made-up machine, so these are my made-up names):

  1. Confirm & Comfort - little crumbs of programming chaff that ask yes-no questions, tell the user that something is happening (hourglass or thermometer bar), or will happen - once they click the "OK" button. These forms generate comfort for the end user - they let you know the software is working away on something, and they ask a question to make sure you really want to change all the prices of the beanie babies to 19 cents. They add some flavor to the system, but they don't really do much. They are the cranberry-laced croutons of a much bigger salad.
  2. Configure & Forget - rarely used after initial setup. Super simple to program. If you have seen one, you have seen them all. (e.g. System Defaults, Colors, Sizes, Units of Measure, Classes, Departments, Margin Plans, Ticket Types, Currencies, Languages, Addresses).
  3. Daily Maintenance - highly specialized forms that handle the heavy daily lifting of inventory. Lots of code. Speed and flexibility are important. (e.g. Product Input, SKU Definition, Cost Updates, Price Changes).
  4. Show & Tell - retrieves and organizes data for your viewing pleasure. This group includes screens used to select date ranges for running reports and forms used for looking up a particular data set (e.g. Product Lookup, Sales Audit Review, Comparative Sales Reports, Best-Worst Analysis, In-Transit Report, Inventory Shrink, General Ledger Batch).
  5. Transactional - an elite group of highly trained, highly specialized forms used to record an inventory control action. These forms are the workhorses within any inventory control system (e.g. Purchase Order, Receipt, Transfer, Markdown, Inventory Adjustment, Return to Vendor). Lots of code. Lots of business logic.

Transactional forms are the core of any inventory control operations system and the focus of this post. If an action is being recorded that changes an inventory item's on-hand or on-order value (quantity, cost and/or retail), these are the forms that are used.

A characteristic of a transactional form is that it typically has two sections: a header section and a detail section. The header is one row of data that captures, at the very least, the who and the when of the transaction. Who = the employee who performed the action. When = the date and time that the transaction occurred. The detail section contains the "what". What SKUs are being received. What products are being returned to the vendor. What items are being transferred.
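
To make the header/detail split concrete, here is a minimal sketch in Python - not our production code - with hypothetical names like ReceiptHeader and ReceiptDetail standing in for the real forms:

```python
from dataclasses import dataclass, field
from datetime import datetime
from decimal import Decimal

@dataclass
class ReceiptDetail:
    """One 'what' row: a SKU and the values being received."""
    sku: str
    quantity: int
    unit_cost: Decimal
    unit_retail: Decimal

@dataclass
class ReceiptHeader:
    """The single 'who and when' row of the transaction."""
    transaction_id: int
    employee_id: str                    # who performed the action
    occurred_at: datetime               # when it occurred
    details: list[ReceiptDetail] = field(default_factory=list)   # the 'what'
```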

We create and maintain a database application that, at its genetic core, is an accounting system with a very thick veneer of operational epidermis. Now this may surprise you, but here is what bounces against our head-bones when we are creating or enhancing our inventory control application:

  1. First and foremost is this question: What data do we need?
  2. The next question is:  Is this data related to any existing data?
  3. Thirdly - and this is where business logic, programming, and all the accounting bits come into focus - if this data changes, will it affect any other data?

In a transactional form, the tricky bit about all this accounting and business logic is that there is not just one piece of data moving from a 0 to a 1 (or from a 1 to a 0); more likely there are hundreds or thousands of data bits queued up that need to do their own debit-to-credit dance too -- and if one bit changes, then all the bits need to change. It is an all-or-nothing process. If you start the transaction, the software needs to be able to finish what you have kicked off. Even - and this is the very, very tricky part - if the computer power supply dies in the middle of all the fun, or the SQL Server connection drops offline, or some other random act of chaos says, "Hello, nice to meet you" to our finely crafted inventory control software, the application needs to be able to pick up where it was unceremoniously dropped off and finish the transaction.

So with that said, the receiving transaction generates a lot of exciting data changes (increment the on-hand and decrement the on-order), and the transaction has to leave some cookie crumbs along the way so that if the process dies, Hansel the handy programmer can code in a method to find a way back to where chaos stepped in and then finish the march to grandma's house.
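
Here is roughly what those cookie crumbs look like as code. This is a Python/SQLite sketch with a hypothetical receipt_state table and stubbed-out steps - it illustrates the pattern, not our actual routine:

```python
import sqlite3

# Stand-ins for the real work; each would update inventory rows for the receipt.
def increment_on_hand(conn, receipt_id): ...
def decrement_on_order(conn, receipt_id): ...

STEPS = [
    ("increment_on_hand", increment_on_hand),
    ("decrement_on_order", decrement_on_order),
]

def finish_receipt(conn: sqlite3.Connection, receipt_id: int) -> None:
    """Walk the receipt through its steps, committing a breadcrumb after each
    one so a crashed run can resume where chaos stepped in."""
    row = conn.execute(
        "SELECT last_step FROM receipt_state WHERE receipt_id = ?", (receipt_id,)
    ).fetchone()
    last_done = row[0] if row else None

    skipping = last_done is not None
    for name, step in STEPS:
        if skipping:                      # fast-forward past steps already finished
            skipping = name != last_done
            continue
        step(conn, receipt_id)            # do the real work for this step
        conn.execute(
            "INSERT OR REPLACE INTO receipt_state (receipt_id, last_step) VALUES (?, ?)",
            (receipt_id, name),
        )
        conn.commit()                     # the breadcrumb survives a power failure
```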

This is one of those areas where those who know a little about this industry, but have never actually had to write an application (read "Consultant"), would rattle off something like, "Hey, why are you complaining? Any SQL Server engine worth its license fee has transactional rollback, so that if the transaction does not commit you can just let the SQL Server clean up the mess."

Two things that the consultant may not have real-world experience with:

  1. Transactional buffering wins the slow-on-the-go award. If there are one or two records to update, it's not a problem. You got a customer address to update - not a problem. But take 10,000 inventory items that have just been counted in a physical inventory and now need to be marched out to the data playground en masse and updated as one big happy collective body of 0's and 1's -- you got yourself a real programming puzzle. All the tools that Microsoft, Oracle and Sybase offer us for transactional buffering and saving will have us out playing through recess, and will likely have us out there after school. We know, because in version 1 of NeXT® we used transactional buffering. Time to save a physical with 10,000 updates - more than an hour (sometimes MUCH longer). Now - after rewriting the upload and update routines by adding our own "state-of-the-transaction" bread crumbs and eliminating the use of off-the-shelf buffering and transactional rollback - less than 3 minutes. (There is a sketch of the pattern just after this list.)
  2. If a transaction can launch another transaction (receiving product into a warehouse generates a cross-dock Transfer Out, and based on options the Transfer Out automatically creates a Transfer In), you have an even bigger and tougher transaction mess to code for and handle. Each transaction launches another transaction. What if the last part of the Transfer-In fails? Well, then you have to roll back all the other transactions from a completed state to a pending or in-process state.
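
For item 1 above, here is the sketch I promised: the counted quantities go out in small committed chunks, with our own breadcrumb recording the next chunk to run. The physical_state table and the chunk size are invented for the example; the real NeXT routine differs, but the pattern is the same:

```python
import sqlite3

CHUNK = 1000   # tuned per engine; each chunk is one small, fast commit

def apply_physical_counts(conn: sqlite3.Connection, counts: list[tuple[str, int]]) -> None:
    """Apply thousands of counted quantities in committed chunks, leaving a
    breadcrumb of the next chunk to run so a crash can resume mid-way."""
    row = conn.execute("SELECT next_chunk FROM physical_state WHERE id = 1").fetchone()
    start = row[0] if row else 0
    total_chunks = (len(counts) + CHUNK - 1) // CHUNK

    for chunk_no in range(start, total_chunks):
        chunk = counts[chunk_no * CHUNK:(chunk_no + 1) * CHUNK]
        conn.executemany(
            "UPDATE inventory SET on_hand = ? WHERE sku = ?",
            [(qty, sku) for sku, qty in chunk],
        )
        conn.execute(
            "INSERT OR REPLACE INTO physical_state (id, next_chunk) VALUES (1, ?)",
            (chunk_no + 1,),
        )
        conn.commit()   # no giant engine-managed rollback log to drag around
```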

(By the way, I like consultants - they occasionally introduce our products to clients. And who knows, when I grow up,  I may want to be a consultant someday.)

Another characteristic of transactional forms is that they are either "The Event" or "The Record" of a real event.

If they are "The Record",  then these digital events are tied to actual pieces of paper that are the source documents. They in essence mimic a document that was created by hand or generated by another computer system. Think about going to a merchandise mart in  Atlanta, Dallas, LA or (my favorite - because I use to give Open to Buy lectures there) Miami, and placing a hand written PO with a sales rep. That hand written PO is the source document. When you get back to the hotel or back to the office, you pull those hand written docs out of your attache case and hand them to your assistant - Jill or Jack - and they go up the hill and type them into the system.

If the system's transactional form is "The Event", then the system creates the source documents. Transfers from a warehouse to a store, or transfers from store to store, are good examples of that. They produce a document that lists what SKUs, and how many, have been scanned in.

Their main purpose is to update data that is read-by-end-user, write-by-system-only, using the business logic that is embedded in the system's source code. We don't want users to just change an item's inventory on-hand value without recording what caused the change. (By the way, if you do want to change an on-hand value without any business logic, just use a spreadsheet; there is no need to buy our database application, or any other business application for that matter.)

If it were not for the business logic (and the never-ending, feature-bloating, endless stream of exceptions to the business logic), creating an inventory control application would be a pretty simple matter; DataWorks would not have spent 24 years working on it; I probably would not have this job; and there would be no need for me to be typing this blog.

Thanks for the opportunity.

Inventory Control Software - 10 Second Rule - Part 3

In version 7 of NeXT, scheduled for release in the 4th Quarter of 2010, we will be shuttling transactional processing off to a background service so that your current task can be released from pushing a thermometer bar across your screen.
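
A rough sketch of that hand-off, using a plain Python queue and a worker thread as stand-ins for the actual background service:

```python
import queue
import threading

transaction_queue: "queue.Queue[dict]" = queue.Queue()

def post_transaction(txn: dict) -> None:
    ...   # stand-in for the real posting routine (the slow accounting work)

def worker() -> None:
    """Background service: drain queued transactions so the foreground form
    can hand off the work and return to the user right away."""
    while True:
        txn = transaction_queue.get()
        try:
            post_transaction(txn)
        finally:
            transaction_queue.task_done()

threading.Thread(target=worker, daemon=True).start()

# The form handler just enqueues and moves on - no thermometer bar to push:
transaction_queue.put({"type": "receipt", "id": 12345})
```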

Retail Inventory Control Operations - The 10 Second Rule - Part 2

Our retail inventory control operations happen in the back office - typically far away from the public eye. The general public never sees our software in operation. Our enterprise clients typically have buyers huddled as a corporate team and receivers staffed in one or more warehouses around the country. In single-property operations, the store manager usually has their workstation in the midst of the store's back stock, where they do both the buying and the stocking for their outlet. DataWorks end-users are behind the scenes, making the decisions and performing the prep work that enable all of the front-of-the-house operations to work. They decide on the right mix of merchandise, the proper merchandise presentation, the target margin, and when product needs to be marked down and moved out.

They also have a sense of how long something should take to accomplish.  In our last post I talked about how the receiving function is the most heavily used feature of our software. I said that we strive to live within a  10 second rule for our inventory transactions;  yet, I admitted that this function typically takes much longer than 10 seconds to process.

In our home we used to live by the "5 Second Rule". If you drop something on the kitchen floor, and it has been there for less than 5 seconds, you can plop it back on your plate, splash it into your bowl, or drop it into your mouth. More than 5 seconds? Meet Mr. Garbage Can.

We now have three reasons that make this rule obsolete:

  1. Our 3 year old son is not fond of washing his hands.
  2. Our maid is a Roomba robot who is long on determination, but short on suction.
  3. We have Ringo - the pet ferret - who was last seen headed behind the dishwasher with a naked Barbie doll clenched in his jaws.

So, if ANYTHING touches the kitchen floor in our home it is walking down the long dark hallway for an interview with the garbage can.

On the opposite side of this plate is my culinary rule when camping. If  dinner drops to the dirt, I give it a brush, blow on it once, then eat it up,  grime, grit and a grin - regardless of  elapsed contact time with terra firma.

A person's sense of cleanliness or sense of urgency depends a great deal on where they are. The context of "where" determines our expectations of how long is too long.

Put another way: walking down a dark alley is not the same as a stroll in the park.

Now that I have wandered off  topic and stumbled over my metaphors, it is time to loop back and pick up the post where I dropped my chain of thought...

Our inventory control operations happen in the back office, far away from the glare of the public.  But our point of sale partners are right out there in the public's line of fire.  When one of our bar-codes is scanned, the price is expected to "instantly" appear.  When the sale is totaled we  expect the tax  to "instantly" appear.

Besides being done in front of the public, the point-of-sale transaction has some other special distinctions about the end of the transaction that separate it from the completion of other inventory control operations. The point-of-sale transaction has one or more connections to mechanical devices that can mask the time it takes to save the transaction (there is a sketch of the idea just after the list):

  1. Printing of the sales receipt can mask the time to save the transaction to the database back-end.
  2. Popping the cash drawer will absorb a second or two of transaction time.
  3. Rolling coins down an automatic change dispenser chute will amuse patrons, keep the cashier's hands clean, and certainly hide any digital needs with analog springs.
  4. The prompting of a signature or the requirement of a PIN input by the customer puts a great deal of slack time into the transaction.
  5. Verification of ID, with the inspection of a driver's license number or phone number, eats up enormous amounts of CPU clock cycles.
  6. The Credit Card authorization can absorb any residual transactional latency.

The credit card authorization really is a separate financial subroutine event that happens outside the inventory control transaction, but it holds the final keys to locking the transaction down.  When the credit card is swiped we expect the approval to occur in -- what?  Two seconds?  Five seconds? Ten Seconds?

My experience is that when any sales transaction takes over five seconds (credit card or no credit card), the clerk will utter the universal face-saving "the system has been slow all day" disclaimer. (The exception to this is the patient staff manning AT&T cell phone stores; their transactions always take somewhere north of 30 seconds - and they never blink until the 45-second mark is passed.)

When the credit card authorization subroutine takes more than 10 seconds, I start fanning my wallet,  looking for cash or alternate slices of plastic because I figure that the credit card processing company is about to bingo my card - a big blue banner of  broadcasted unhappiness is about to scroll across the Verifone's display.

Exception to the anxiety index?  Black Friday or Christmas Eve - I expect to wait for the authorization subroutine to complete and the delay will always be the result of the extra demand being pushed into the system - it is never my problem on those days.

Consider this - our retail inventory control operations typically occur as solo events -- no one is around to share the moment with, no one else can hear the "this takes too long" statement. Additionally, there are no mechanical contraptions to print, pop, roll, amuse or distract us from the event. All we offer are thermometer bars that mark time to the rhythm of the saving records. With the lack of distraction, I believe our inventory control transactions seem to take longer than point-of-sale transactions, even when they take the same amount of time. Of course, how many 100-line-item POS transactions have you ever heard of?

Having just said that, I remember that back in the late 1980s and throughout the 1990s our SCO Xenix and MS-DOS applications were designed to begin a print job immediately on the save of a transaction. As soon as the receiving event was told to process, we would start printing out a "report" that would document the event for accounting and paper backup. The first step of our transaction would be to dump the contents of the transaction out to a dot-matrix printer. The print command would take half a second or two to push, and once it completed we could continue with the work of quantity, cost, retail and accounting updates. The printer would take a good 30-60 seconds to weave fan-fold paper under its 32-character-per-second print head, and by the time the printer ejected the paper up to the rip line, the transactional data was safely saved and waiting for the next transaction to begin.
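
In sketch form, the old sequence looked something like this - the function names are stand-ins, and only the ordering matters:

```python
# Stand-ins for the routines of that era; only the ordering is the point.
def format_receiving_report(receipt): return ""
def send_to_dot_matrix(text): ...        # the print command returns in well under a second
def update_quantities(receipt): ...
def update_costs_and_retails(receipt): ...
def post_accounting(receipt): ...

def process_receiving_event(receipt) -> None:
    """1980s-style masking: hand the document to the dot-matrix printer first,
    then do the database work while the printer grinds through fan-fold paper."""
    send_to_dot_matrix(format_receiving_report(receipt))
    update_quantities(receipt)           # the printer is still weaving paper
    update_costs_and_retails(receipt)    # at 32 characters per second,
    post_accounting(receipt)             # so nobody is waiting on the database
```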

We now design our back office inventory control operations to only print on demand. A document that can be printed to the screen, or emailed to a department, or published as a PDF file has taken away our ability to hide the transaction's save in the shadow of the printer. Plus we feel pretty good about our green policy and the fact that we are saving acres of pine forests from being bound into journals full of DataWorks documents.

Having just pulled those two-decade-old memories up to my frontal lobe, it might be possible to design the receiving transaction to spool up the bar-code ticket printer at the beginning of the transaction whenever hang tags or adhesive sticker stock are needed.

I have done a fair bit of political back-pedaling in this post. I declare that we have a 10 Second Rule and then do a great deal of mental pondering on modifiers to the rule. The 10 Second Rule is a design goal. Whenever I hear there is a "Rule" in computer programming, I start toward the fire exit, looking to protect myself with a new software vendor. Rules are created by programmers, designers or companies who have either a mental block, a framework restriction or a language weakness that is preventing them from doing something better.

Hopefully, I have painted a picture that our inventory processing time is indeed relative to where and when it occurs. Both the "where" and the "when" have something to do with our sense of urgency and our expectations of system performance. The next post will be "when" you will no longer have to wait to hear how DataWorks will deal with transactions that do take over 10 seconds.

Retail Inventory Control Operations - The 10 Second Rule - Part 1

Retail inventory control operations have been DataWorks' focus for the past 24 years. We design systems to assist buyers in creating purchase orders, receivers in processing packing slips, auditors in reviewing costs, and controllers in generating accounts payable invoices and credit memos.

One of our design tenets is that user input time on heavily-used-forms is very expensive - so we try to spend our CPU coin wisely.  If we can shave a half a second off a process, that half second gets multiplied by all our end-users doing the same process every single day.  A billion here, a billion there, pretty soon it adds up to whole lot of time for running reports, re-ordering fast movers, or dialing in a pizza.

Notice that our tenet has a condition - "Heavily-Used-Forms". If a form is used to launch a one-time process, or it is a rarely used configuration option, we don't spend our R&D budget on making the form into a high-performance, code-injected hot rod.

Speed to save and process a transaction is of extreme importance. If a transaction takes over 10 seconds to save then the end-user's cranium shifts into idle and the neurons start filling their synaptic gaps with sudoku puzzles and we fail in our efforts to get more data for the price of one transaction.

The receiving transaction is the core of our inventory control operations - it has lots of moving parts and it generates a lot of transactional heat. When the receiver clicks on the process button, he or she is kicking off a big assembly line of linear processes (sketched in code just after the list):

  1. Accommodate Units of Measure, Currency Rates, Terms Discounts, Freight Charges, Cost Changes and Vendor Allowances
  2. Consider the employee's access rights and privileges
  3. Update Inventory Quantity On Hand, Quantity Received, and Status
  4. Flag an Item for the need to update the Stock Ledger *
  5. Update a Purchase Order's Quantity On Order and Back Order Status
  6. Calculate Base Cost, Net Cost, and Landed Cost
  7. Update a log for any shifts in cost or retail
  8. If merchandise price tags are needed, generate a Ticket Batch
  9. If a Packing Slip, generate a General Ledger journal entry to book the Asset and PO-Clearing accounts
  10. If an Invoice, generate an Accounts Payable Invoice with the appropriate distributions.
  11. If allocating to more than one outlet, generate one or more Transfer-Outs.
  12. If Transfers are set to process in one step versus two, process one or more Transfer-Ins.

The bad news is this gymnastic routine takes much longer than 10 seconds to stick. With large receipts of over 100 SKUs, just the calculation of new costs takes longer than 10 seconds to execute.

By the way, the one thing that does not occur here is a posting to our Daily Inventory Summary system. Notice the special * above. We set a flag, but we don't do any actual work. The summary table is a daily stock ledger that is used to track various pieces of information such as beginning on hand, net sales, discounts, receipts, returns, markdowns and shrink. We decided it would take too long to update the core of our analytical system inside the transaction. And since the stock ledger is not a requirement for daily operations, we reasoned that it could wait and run at a later time.
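
The flag-now, post-later idea in sketch form, with a hypothetical stock_ledger_queue table; the expensive summary math runs on somebody else's clock:

```python
import sqlite3

def rebuild_daily_summary(conn, sku, business_date): ...   # stand-in for the real aggregation

def flag_for_stock_ledger(conn: sqlite3.Connection, sku: str, business_date: str) -> None:
    """Inside the receiving transaction: one cheap insert that says 'post me later',
    not the expensive summary math itself."""
    conn.execute(
        "INSERT OR IGNORE INTO stock_ledger_queue (sku, business_date) VALUES (?, ?)",
        (sku, business_date),
    )

def post_stock_ledger(conn: sqlite3.Connection) -> None:
    """Later, off the user's clock: roll the flagged SKUs into the daily stock ledger."""
    for sku, business_date in conn.execute(
        "SELECT sku, business_date FROM stock_ledger_queue"
    ):
        rebuild_daily_summary(conn, sku, business_date)
    conn.execute("DELETE FROM stock_ledger_queue")
    conn.commit()
```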

This is actually a clue to our design thinking. Operational events are linear and need to occur in the users' time frame; analysis and data crunching are not time-sensitive. Analysis can run a minute, an hour, or a day later, or even be regenerated a year from now.

So how can we speed up operations but deal with the need for real-time information?

That will be the subject of our next post.