Development of ACBLscore+ discontinued

by Matthew Kidd

Development of ACBLscore+, which was to replace the current ACBLscore program, has been discontinued. Cessation of work was announced at the National Board of Directors open Q&A meeting at the summer NABC in Las Vegas. Many questions remain unanswered. The following document was supposedly created by Sharon Anderson and has been asserted to reflect current CEO Robert Hartman’s synopsis of the situation. I have seen this document in PDF form (page 4) on the D11 website and at one other location. Sharon Anderson was the ACBL president in 2012 and is still the D14 national representative.

ACBLscore and ACBLscore+ Update

  1. The ACBL has made the decision to discontinue the ACBLscore+ project. A working group consisting of Board members, internal staff and outside experts determined efforts to develop a completely new system ACBLscore+ fell short of what was required to work for the League’s members. The ACBL Board of Directors agreed with this decision at its July meeting in Las Vegas.
  2. The ACBL is now moving forward with plans to enhance the features and benefits of its existing scoring software, ACBLscore, and extend the life and functionality of the software for a long time to come. We are doing this because:
    • Our tournament directors, our club owners and operators and our members and players have confidence in the existing system, in its reliability, in its accuracy.
    • It is a system they know and have confidence in. We want to build on our success.
    • Folks know how to use it; they know it works on their computers; they are comfortable with what they know and we want to add on to it.
    • We want our club and tournament directors to be able to run games on their existing computers without expensive upgrades or complicated new lessons.
  3. The ACBL is already enhancing the scoring software to give it more features and to integrate it more easily into complementary systems. For example, we have integrated data population with TourneyTrax, and we are building integration to create real time result display features on the web.
  4. The total capitalized cost of the ACBLscore+ project is in excess of one and a half million dollars and we are determining what portion of the project’s product can be repurposed into the enhanced ACBLscore.

Overall, this is bad news. When the ACBL introduced ACBLscore in the 1980s, it was ahead of the power curve at a time when home computers were just escaping from the hobbyist realm into the general populace. But architecturally, the fundamentally DOS-based ACBLscore has changed little since then, even as it has been tweaked to support additional movements, masterpoint formula changes, and a plethora of special games. DOS is ancient history now, Windows is 64-bit (not 16!), Apple desktops and a variety of non-Microsoft mobile devices are common, and web browsers have become a cross-platform operating system of sorts for tasks that are not too CPU intensive. ACBLscore still doesn’t generate decent HTML output, hence the need for programs like ACBLmerge. That ACBLscore still runs at all on Windows is a testament to Microsoft’s backwards compatibility support, maintained primarily for their business customers.

ACBLscore is largely the creation of one person, Jim Lopushinsky. My guess is that the ACBL has gotten good value from the fellow. But one day he will die, go senile, or just declare he doesn’t have the energy or focus to work on ACBLscore. Then the ACBL will have a real problem.

ACBL management is not ignorant of these issues. Over the years they’ve tried several times to modernize ACBLscore. By 2011 they seemed serious: a bidding process was announced in October of that year, a winner was selected that November, and the contract officially started on April 2, 2012.

What happened? I can only speculate. I spoke to Nicolas Hammond, who headed the ACBLscore+ project, for over an hour on the phone early in the project. He is a smart guy and his programming credentials are solid. Perhaps there were project organization issues. But my guess is that the work turned out to be more complicated than it first appeared. ACBLscore is probably one of those programs where, unfortunately, the program is the specification. Sure, ACBLscore has user documentation, but I’ve never seen any functional documentation, let alone a formal requirements document. As for the game file format, I may have the best publicly available documentation, which itself is far from complete given my desire to do other activities like playing bridge. Presumably the ACBLscore code is under revision control, but I’ll hazard a guess that the documentation for many changes doesn’t go beyond the commit log.

Most of the ACBL’s revenue comes from sanction fees rather than dues. ACBLscore tracks the sanction fees as well as other tournament financials. Making sure that functionality stayed intact was important to the ACBL, if not to club users. This issue alone must have been a small Pandora’s box.

I also heard there were many complaints about the interface for ACBLscore+. I cannot speak directly to this because I did not try the ACBLscore+ beta versions or see any of the demos. But to be fair to ACBLscore, the application is a bit like the old text editor Emacs: data entry and manipulation are quite efficient once you have memorized the keyboard commands. It may be harder to achieve such efficiency in a browser-based application that must be supported on many browsers across many platforms. This consideration suggests the developers may have been better off using an application language such as Java or .NET. Java was in fact considered and rejected. Choosing .NET would appear to tie one to the Microsoft platform, but a .NET application is much more portable than a native Microsoft executable (see Project Mono), provided one is careful about which features of .NET are used. Moreover, in November 2014 Microsoft announced it was open-sourcing .NET, such that even the most recent version will run cleanly on Mac and Linux.

Can any good come out of the ACBL’s decision? Maybe. First, it gives me more motivation to work on ACBLmerge; I’ve worked very little on it for the last two years because it seemed that the work would soon be for naught. Second, item #3 in Sharon Anderson’s document is promising. It appears that the pressure to improve the HTML output has grown strong, perhaps in no small part due to the example set by ACBLmerge. ACBLscore can’t do worse than it currently does in this regard. I just hope the ACBL doesn’t half-ass the effort, producing output that isn’t nearly as good as ACBLmerge’s but is good enough to sap my motivation to continue ACBLmerge. I’m certainly willing to help the ACBL get it right.

Also, Nicolas Hammond did say that the ACBL had written a program to convert ACBLscore game files to an XML format. This itself is useful for anyone who wants to write software like ACBLmerge or the Payoff Matrix. But it seems the ACBL isn’t interested in releasing the code. Developers may instead want to refer to my ACBLscore Game Files Decoded reverse engineering effort or use my ACBLgamedump utility.

These consolations aside, the future of ACBLscore is a ticking time bomb.

Subsequent events

- ACBL headquarters in Horn Lake is flooded, along with much of DeSoto County. The Governor of Mississippi declares a State of Emergency. ACBL headquarters is closed until September 23rd. Online services remain available.

- Adam Parrish tries to investigate for Bridge Winners in a well-written article but is unable to discover much. More than 300 comments later, many from well-connected people, we still know very little. Bridge Winners presented ACBL CEO Robert Hartman with a copy of the article before publication, and his response is included at the end of the article.

Dear Adam,

I respect Bridge Winners as a vibrant online community where people who are passionate about our game can learn, share, and together gain a deeper understanding about a variety of bridge-related topics. That’s why it is disappointing that your opinion piece about the ACBL’s ongoing efforts to upgrade its scoring software draws on inaccurate information and fosters false impressions rather than understanding.

Many in our organization, including myself, have shared their time and perspective with you; some have corresponded with you to help you understand the decisions that have been made and the benefits they are expected to yield.

I am not going to correct everything I find inaccurate, imprecise, or misleading. Honestly, there is just too much. I am not going to dispute others who may not be quoted accurately, even if their alleged recollections are at variance with reality. However, there is one statement in your article that I absolutely agree with:

“…There was not anything glaringly wrong with ACBLScore, simply that it needed modernization…”

This is a true statement. Ultimately, the truth of this statement is at the core of the decisions that have been made this spring and summer to focus our efforts, energies, and resources on enhancing ACBLScore in a defined and achievable way.

Over the years, several attempts have been made to rewrite the ACBL scoring software. Other efforts have come up short despite everyone’s best intentions and efforts.

The fact is that we are on track to maintain the integrity of our scoring software, provide a solid foundation for future expansion, and deliver results that will work for our members promptly and affordably. Our current phase will be field tested at selected sectional tournaments before the upcoming NABC in November. Our plan is to roll out additional features and new functionality over time that meet our members' needs using a logical, phased approach.

Sincerely,

Robert Hartman

CEO
American Contract Bridge League

This is a pathetic response. Rather than providing clarity, Hartman dismisses the rumor mill for, well, being a rumor mill. The phrase “Honestly, there is just too much” is a dodge. Being a leader means prioritizing, and here that means deciding which items are most misleading, if that is really the case, and spending a few paragraphs discussing them; a doctoral thesis is not required. I’ve met Mr. Hartman in person. He was intelligent, young, had some understanding of technology, and was even willing to move from San Francisco to Memphis to take the leadership job. The ACBL is lucky to have him. The response above is beneath him.

- Nicolas Hammond, the CEO of Hammond Software, which was hired to develop ACBLscore+, comments on the Bridge Winners posting, referring people to his posting Response to Bridgewinners/Facebook on the Bridge Score Plus website. His lengthy response has many interesting details for programmers and software project managers, but he is circumspect about what happened between ACBL management and his company.

Mr. Hammond’s comments support the theory in the Bridge Winners article that the proximate cause of the ACBLscore+ discontinuation was contract negotiations over software ownership, specifically the ACBL seeking to renegotiate for exclusive ownership during 2014 despite having been offered a discount for non-exclusive ownership during the initial negotiation. But I find it hard to believe this is the full story. Companies that want full ownership will pay what it takes if they think it is important enough; for example, Microsoft paid $8.5 billion for Skype and $2.5 billion for Minecraft’s parent company. By contrast, an ethos of cheapness permeates the ACBL rank and file, regardless of their income or net worth, and that ethos bubbles up to the ACBL leadership. I find it hard to imagine that the ACBL would care about exclusivity of the ACBLscore+ ownership if they felt they were getting a good deal. My guess is that they felt there were cost overruns and were trying to compensate for them by renegotiating for exclusivity.

The following excerpt from Mr. Hammond’s response has the ring of truth. I’ve edited very slightly to introduce acronyms at their first use for people unfamiliar with the software development process.

Change Requests

One thing that can quickly derail a software project is change requests. Both ACBL & HS knew that there would be changes during the course of the contract. We both wanted to avoid the costs associated with them.

For example, if ACBL had decided that the Help button should now be a yellow on blue button because someone read an article that some people are red/green color blind and may not see the Help button, we go through a process where client (ACBL) tells contractor (Hammond Software (HS)) that there is a change request (CR). HS asks ACBL to write a functional requirements (FR) document. ACBL writes an FR and sends it to HS. HS goes back and forth with ACBL to make sure that the FR really does cover all the requirements necessary. HS stops developers working on the project and gets their feedback on the CR and then asks a developer to write a functional specification (FS). The FS goes back to the ACBL. ACBL approves the FS. HS then gets estimated times from developers and creates a quote. Because of the CR overhead process, the mark-up is usually 5x to 20x the actual developer cost (sometimes substantially more). HS tells ACBL the cost to implement the CR. ACBL can’t believe the cost could be so much. ACBL/HS negotiate the cost. The lawyers from both sides get drafted to write an amendment to the contract. The costs go up. An amendment to the contract that covers the CR is finally agreed and written by the lawyers. ACBL goes back to its Board of Directors (BOD) because this is now spending money that was not allocated in the budget. BOD meets every 4 months so we all have to wait. Meanwhile developers go back to their coding but aren’t sure if the CR will be approved or the impact on other parts of the code. ACBL BOD approves amendment to the contract. ACBL contacts HS and informs them of the decision. HS contacts a developer who takes 5 minutes to change the foreground color and background color of an image. HS bills ACBL an unbelievably large amount of money because of all the overhead involved. We now have a very pretty yellow on blue image. Of course, this is now totally out of line with the rest of the colors of the project because this was not taken into consideration when we just looked at one small piece of the project. Someone from ACBL is later doing a crossword puzzle and looks up what tritanopia means. Oh dear. The process starts again. Even more money, even more time delays.

There were changes to the project made during the ACBLscore+ contract. This was planned and expected during the contract period. We knew there would be changes. We did not know what they would be. There was no extra billing from HS to ACBL for any of these changes.

From this I infer cost overruns and dissatisfaction on both sides. The situation described above is common and it is impossible to assign blame from these comments alone. The ACBL may have been better off letting Hammond Software deliver a functional ACBLscore+ and then having a big batch of change requests addressed in a subsequent release. Switching tasks as a programmer is costly. It takes time to pick up where you left off, anywhere from 20 minutes to several days depending on the complexity of the task, and this time directly translates into money.

- Robert Hartman releases a more informative statement. Also see the ACBLscore+ Fact Sheet, which I believe was released concurrently with Mr. Hartman’s statement below. I received both documents from Ken Monzingo, the D22 National Representative.

Comments by CEO Robert Hartman Regarding ACBLscore+
September 22, 2014

There has been discussion recently, online and elsewhere, about ACBL’s efforts to update ACBLscore, our bridge scoring software used in clubs and tournaments across North America. From comments I’ve seen and heard some folks are confused about our decisions and progress.

Please let me share the facts and try to cut through any confusion about what we have done, why we have done it and where we are going. Just so you know, I’m not relying solely on my own recollection of events, which is certainly clear, complete and accurate, but also on the record of events as chronicled by our staff and League Counsel.

More than two years ago, in April 2012, the ACBL signed an agreement with an outside consultant. The agreement tasked the consultant with the responsibility to develop, test and provide our non-profit membership organization with code for a software program to replace ACBLscore. The action followed extensive study and recommendation by the ACBL Board to management at the time.

The agreement called for items to be completed and tested for the new code, known as ACBLscore+. The agreement laid out specific time frames, six key performance milestones and provided specific fees for each phase of the project.

In the summer of 2013, the project fell behind schedule when the fourth milestone was not met. During the next six months, everyone involved attempted to get the software development project back on track.

Earlier this year, in February, however, ACBL management determined that the ACBLscore+ project was significantly behind in development and testing … significantly behind … and the prescribed $1.4 million consulting fee had been paid in full.

As a result, as CEO, I assembled an advisory group of authoritative experts from the business community, from the ACBL Board and the ACBL staff. The assignment for the advisory group was to review what the consultant delivered and render an independent judgment and recommendation about the viability of continuing the ACBLscore+ project.

After focused study and deliberation, the advisory group found that the delivered work product did not meet the objectives spelled out in the agreement. The group recommended the agreement be terminated. And the agreement was terminated in April 2014 - Terminated. By the way, the members were unanimous in their findings and recommendations.

Now, questions have been raised about the ownership of ACBLscore+. Let’s be clear. The ACBL paid for development of this code. Under the agreement - in clear language - the ACBL owns the ACBLscore+ code. There are no ifs, ands or buts. If the consultant "owned the copyright" to ACBLscore+ as has been claimed in some published accounts, then why did the ACBL actually grant him a license to use the new code in the agreement?

Be assured, if the ACBLscore+ program had been completed and published, the ACBL would have obtained copyright protection by using the copyright symbol and terms of use language similar to those developed by outside copyright counsel for the recently published “Learn to Play Bridge” program. Since ACBLscore+ was not completed, fully tested or published, the copyright symbol and terms of use language could not be integrated into the ACBLscore+ program.

Phase one of a refreshed ACBLScore Live will be field tested at selected sectional tournaments before the upcoming NABC in November. And we plan to roll out additional features and new functionality over time using a logical, phased approach.

Software development is complex. In business and in life, sometimes plans do not work out as you hope. This has been one of those cases. Through it all, the ACBL has endeavored to be good stewards and serve our members well. And ultimately, that’s what all this is about.

The initial tone of Hartman’s statement is poor. He refers to “confusion” in the abstract, as though it were completely unrelated to the ACBL’s failure to provide clarity two months earlier. This is followed by a legalistic paragraph. Finally we get to the meat of the matter from the perspective of the ACBL, namely that the project fell behind in both development and testing. Unfortunately, this isn’t an uncommon occurrence in software development. I don’t know why the ACBL didn’t come out and state this immediately. One possibility is that the ACBL was partly complicit by failing to provide certain information to Hammond Software in a timely manner, as might be inferred from a full reading of Nicolas Hammond’s Response of September 22, 2014. But lacking a detailed timeline of who did what, this remains speculation.

The comments about the ownership of ACBLscore+, discussed toward the end of Hartman’s letter, remain murky. The ACBL seems to be ignoring the distinction between exclusive and non-exclusive ownership. My understanding from Nicolas Hammond’s comments is that the initial contract granted the ACBL non-exclusive rights to ACBLscore+ and its source code, with the ability to modify the code on their own and release derivative versions to anyone on any terms they wished. But the non-exclusivity would not prevent Hammond Software from releasing ACBLscore+ or derivative works, whether or not done under contract for the ACBL. It is not clear to me whether the ACBL got cold feet about this provision or simply wanted to ensure that releases and subsequent derivative works provided directly by Hammond Software used a different name (and presumably a different logo) than ACBLscore+.

- The November 2014 issue of the Contract Bridge Forum arrives. On page 1, D22 National Representative Ken Monzingo stated the following in his District Direction column. (Curiously, the wording online doesn’t quite match the printed version; the following is the printed version.)

Ron Nessen, former White House Press Secretary for Gerald Ford, once said: “Nobody believes the official spokesman … but everybody trusts the unidentified source.” That scenario appears to be upon us now in ACBL land. CEO Robert Hartman has recently cancelled an expensive ACBL Score+ project and chosen to funnel those monies and efforts into in-house improvements of the present Score – “Score” is the software program directors use to run bridge games, etc. The naysayers are crying foul – much more interesting than just nodding acceptance.

I believe we are all occasionally guilty of jumping to conclusions without facts to achieve a result we would think is more fun than looking at the (sometimes boring) truth. Isn’t it more exciting to blast away at someone (something) to make them look bad, than to gather in a circle, have a beer and laud how good people are? Trust in our national politicians is at an all-time low in rhetoric … and somehow this has spilt over into our bridge association.

Before I made my way onto the national board of directors, I admit I enjoyed engaging in a little ACBL-bashing. I didn’t have any facts, mind you, but surely wanted to be a member of the “we know all” crowd. Now, after years of dealing with the myriad of issues in governing this 75-year-old successful association, I have gained enormous respect for the efforts of the board’s approach to better – or maintain status quo – of bridge in the new world circa 2014. And also with our management’s improvements on all aspects of ACBL – including the Score upgrade.

Questions: People ask me why I have not taken a public stand on the ACBLScore+ $1.5M cancellation, since I’ve been so vocal about our International overspending. I can give you two answers: facts and trust. I do not have definite facts that anything more than a bad business decision went down, causing the termination of the attempt to improve the current ACBL Score (which was written decades ago). And often, maybe naively, I choose trust over accusations – especially those lacking sufficient smoking guns.

More Questions re what happened: Poor initial contract? Insufficient early research on who was best to attempt the massive project? Failure to put it out to bid? And/or waiting too long to cancel when production schedules weren’t met? And finally, inability to communicate the situation in understandable lay language?

I give you my three answers:

No. 1. My opinion: yes on all (or most) of the above.
No. 2. I have few facts to back up my opinions.
No. 3. I trust management’s decision to terminate, and their determination on in-house improvement of current ACBL Score.

Unclear? Yes. If some on the national board are still somewhat uncertain whether this was just a failed business contract, or was mishandled, or simply a heroic decision to stop the bleeding before incurring more losses, then how can those on the outside have such certainty in their accusations of mismanagement?

Does this column clarify it? Of course not. But I owed you an answer (opinion). As/if facts and intel change, so will I.

This column seems written in direct response to an e-mail conversation I had with Ken. Those people asking him why he has not taken a public stand on “the ACBLScore+ $1.5M cancellation,” since he has “been so vocal about our International overspending,” may well be a singular person: me. I never asked him to take a public stand. I merely told him that the rank and file are owed an explanation:

Subject: ACBLscore+ discontinuation
Date: Thu, 25 Sep 2014 10:36:58 -0700
From: Matthew Kidd
To: Ken Monzingo

Ken,

The rank and file are owed an explanation for the sudden discontinuation of ACBLscore+. The $1.5m spent, and presumably largely wasted, is about nine years of the unnecessary WBF dues (~$160k per year) that you have been fighting so hard to end.

- Matthew

The explanation may well be the prosaic and common story of a software project that was over budget and behind schedule. If this is the full story, I don’t know why ACBL management hasn’t been clearer in communicating the message. Is it an unfortunate story? Yes. Somewhat embarrassing? Yes. A sin? No. A capital offense? No. Far bigger entities than the ACBL have been through similar situations. Exhibit A: SAIC’s bungling of the Virtual Case File (VCF) for the FBI, which set taxpayers back more than $100 million. But the VCF disaster was not entirely SAIC’s fault. Cited as part of the problem was “a deeply flawed 800-page set of system requirements that doomed the project before a line of code was written.” I find it plausible that the ACBL deserves some of the blame for similar reasons.

One thing that bothers me about Ken’s article is the frank admission that, months later, the national board members don’t know what happened. It’s not even clear they are interested in knowing. It is as though the country were at war and Congress had little interest in even the big picture of the political objectives, lives lost, money spent, territory gained, degradation of the enemy’s resources, and so on.

I’m also disturbed by the assumption that people asking for an explanation are “naysayers”, “crying foul”, or “jumping to conclusions.” Sometimes a request for an explanation is just a request for an explanation and perhaps a degree of reassurance that something has been learned from the debacle.

Ken is big on trust, even blind trust. I feel trust is earned from past experience. As I told Ken privately, I trust the ACBL to run tournaments well, but that is because I have been to many tournaments and none was ever a disaster or even just poor. I also trust ACBL tournament directors to be competent by and large. Sure, I can think of a couple of negative incidents, but they were infrequent. The ACBL’s track record on technology, however, is not great. It’s not a disaster, but it could be much better. My trust has not been earned, and the National BoD is naive if they think their trust has been earned.

Footnote: Ken pointed out that the $160k per year I cited for International spending is only one of several related expenditures that he is fighting against, such that the $160k is a significant underestimate. But this is a story for another day in another place.

- Motions for the Fall 2014 (Providence) National Board of Directors meeting are posted on the ACBL website.

Item 143-150 concerns ACBLscore. It reads:

The following steps be taken regarding ACBLscore:

  1. Management shall immediately cease all new development work on ACBLscore (except for changes required to implement new masterpoint rules for 2015 or changes to fix known “bugs”).
  2. A complete and independent review of our progress to date shall be performed with an evaluation of the available options. This review shall be done on a high priority basis by a group of highly competent technical people capable of analyzing and evaluating the decision(s) being considered and recommending how best to move forward. The President, with appropriate input from outside technical experts as well as other board members and members of staff, shall be charged to select the committee and determine a reasonable target date for completion of the review.
  3. This committee will be charged with, at a minimum, evaluating the following:
    1. Determine whether the proposal that we enhance the existing ACBLscore is the best option.
    2. Evaluate what to do with the product already received from Hammond Software. Options to be considered should include whether to discard it or if the system written thus far performs well enough to be completed at a reasonable cost.
    3. Evaluate whether the ACBL has sufficient people with the experience and expertise required to support building a system of this magnitude. In addition, evaluate whether we have staff talent capable of managing this very large effort.
    4. Determine which desirable enhancements to the existing ACBLscore system should be implemented on a high priority basis and which can be deferred to a later deliverable.
    5. If the recommendation to enhance the current system is considered viable, review and evaluate the detailed plan that the CEO promised would be delivered in November to ensure that the total costs and estimated time frame are acceptable and well documented. Also, propose methods to ensure that critical enhancements will be delivered within a reasonable time frame and at a reasonable cost.

    The group shall also document long term goals, a detailed plan to reach these goals, a time line with cost estimates and short-term goals with suggested target dates.

Effective Immediately

I don’t know who requested this motion.

If passed, this motion stops the plan to continue ACBLscore development put forth in Robert Hartman’s September 22, 2014 statement above.

- 30 ACBL members with a strong background in computer science submitted a letter to the National Board of Directors shortly before their meeting at the Nationals in Providence, RI.

To the ACBL Board of Directors:

We are bridge players and software development experts. From observing recent decisions made regarding ACBLscore and the explanations given for them, it appears to us that the Board of Directors (BOD) is not being well advised on matters involving technology. Because technology decisions are potentially expensive and have wide-reaching implications for how our game is played and perceived long into the future, making informed decisions about technology is essential. We therefore recommend that the board create a Technology Committee that includes non-BOD bridge-playing technical experts, and empower that committee to recommend actions to the BOD and management.

We have not studied all of the issues pertaining to the decisions relating to ACBLscore, but given the general facts of a 30-year-old system written in Pascal for DOS, which is having compatibility issues with new hardware (electronic scoring devices including BridgePads and BridgeMates) and operating systems (Windows 7 & 8), we would cast our vote in favor of developing new software. Most importantly, though, the technical arguments being made to support keeping the existing software are not valid, and we implore those in power not to make decisions based on them.

As we understand them, the ACBL’s arguments in favor of updating ACBLscore rather than continuing the project to build ACBLscore+ are that Pascal is not an antiquated language, that the current software can be added to and upgraded forever, and that sticking with the current program will benefit users because they do not have to learn a new interface or buy a new computer.

We reject all of these arguments. Pascal is not dead, but it is antiquated and out of fashion, and finding programmers familiar with it is a challenge. The current program may technically be capable of some new features, but expecting programs written before the Internet came into existence to integrate into the web-based, mobile-oriented, modern world is not realistic.

Making changes to software — especially the fundamental user interface — always presents challenges for veteran users, but this is a necessary cost of innovation and advancement. Today, software is generally evaluated on how easy it is for new users to get up to speed, not how easy it is for experienced users to navigate. A modern, web-based interface brings instant familiarity and makes a new program much easier to learn than the text-based DOS infrastructure of ACBLscore.

No hardware upgrades should be necessary to run a web-based application — every computer from the last 15 years has a web browser. ACBLscore+ should run on almost any computer that can run ACBLscore — except an ancient DOS machine. The same cannot be said the other way around: ACBLscore cannot run on many computers that could run ACBLscore+, most notably computers not running Windows or DOS (including Macs, Chromebooks, phones, or tablets). The DOS version of ACBLscore (which we are told includes features the Windows version lacks and is preferred by most TDs) will not run in the standard 64-bit Windows 7 or 8. An appropriate balance between future-proofing software and making sure that everyone can use it today must be struck, but one should always err on the side of compatibility with newer technology.

This is not intended as a complete technical analysis of ACBLscore or ACBLscore+; it is merely an observation that decisions have been made based (as far as we can discern from public disclosure) on bad and incorrect technical arguments. The board should not be expected to have expertise in every area, but it should receive reliable outside expert opinions when its own knowledge is lacking, especially when such opinion is readily available within the ACBL’s membership. We anticipate no shortage of qualified volunteers to serve on a Technology Committee. In the meantime, we urge the BOD to demand valid arguments if management insists upon abandoning the new ACBLscore+ software and would gladly answer any technical questions of members of the BOD.

Sincerely,

Greg Humphreys, Ph.D. Computer Science, Stanford. Formerly Professor of Computer Science, University of Virginia. Currently Staff Software Engineer, Google and Chief Technology Officer, Bridge Winners.

Chip Martel, Ph.D. in Computer Science, Berkeley. Professor (and former chair) of Computer Science (now Emeritus), University of California at Davis

James R La Force, EdD, Systems Analyst, Systems Program Manager, USAF and industry.

Ping Hu, PhD in physics. Over 20 years experience in software development. Have held senior development positions including Intel, Lucent Technology and Navigation Technology. Current work in Thomson-Reuters.

Max Aeschbacher, Associate Professor and Department Chair, Developmental Mathematics, Utah Valley University.

Jon Gustafson, B.S. Physics, M.S. Computer Science Colorado State University. Designed and implemented languages and compilers for ecosystem simulation at CSU. Retired software architect for Hewlett Packard.

Eugene Hung, PhD in Computer Science, University of California, San Diego. Senior Software Engineer (>10 years experience) @ IBM, and Chief Editor, Bridge Winners.

Sriram Narasimhan, Ph.D. Computer Science, Vanderbilt University. Computer Scientist at NASA Ames Research Center. Designed and maintain websites and applications for bridge federation of India.

Kevin Lane, Ph. D, MBA, information technology professional/advisor for major corporations including FedEx, Yahoo, Microsoft and also various ACBL organizations.

Matthew Kidd, Ph.D. Physics, UIUC. 20 years experience in bioinformatics. Chief Software Architect at Rosetta Inpharmatics. ACBLmerge developer.

Peter Friedland, PhD Computer Science, Stanford University, Former Chief Technologist NASA Ames Research Center, Founder and CEO three software companies, Senior Advisor on Technology United States Air Force, Chair, Technology Commission, City of Cupertino, California.

John D Stiefel: Director of development for the LTD (Long Term Disability) system at Aetna, and director / merger coordinator for the Valuation and Repetitive Payment systems.

Ben H. Franz BS Computer Science, Ohio State University 1971. Senior Staff Software Engineer Measurex Corp. Senior Engineer Bell Northern Research. 20+ years experience maintaining, expanding complex software systems.

Jeff Ford, Ph.D. Computer Science, University of Texas at Austin. Currently Staff Engineer, Context Relevant.

Phil Clayton, PMP. IT Project Manager at Toyota Financial Services.

Bob Lafleur, career software designer, developer, consultant.

Ankur Rathi, B.S. Computer Science, University of Texas at Austin. 5+ years IT experience, including maintenance and replacement of legacy systems at American Airlines.

Robert Brady, BS Computer Science UVa.

Robert I. Eachus, MS Operations Research and Statistics, RPI, member of several ISO/IEC JTC1/SC22 Programming Language technical committees, as a Lead Software Engineer at MITRE, advised the government on several large software acquisitions.

Aaron Cavender, B.S. Comprehensive Mathematics (1989), B.S. Computer Science (1990) University of Arizona, 25 years professional software development and management.

Ilya Kuzkin, Senior Data Modeler. M. Sc. Applied Mathematics Moscow State University (1997). Database Expert (Oracle Certified Professional) with 18 years of IT experience, including technical architect roles.

Andrew Hoskins B.S. Computer Science, Carnegie Mellon University. Software Engineer, Facebook.

Adam Meyerson, Ph.D. Computer Science, Stanford. Software Engineer, Google.

Georgiana Gates, 40+ years in the computer industry, mostly in oil & gas, specializing in database programming.

Ralph Lipe. Former Microsoft Partner, and architect for Windows 3.x-Windows 95. Former Google Senior Software Engineer.

Peiyush Jain, MS Computer Science, University of Texas at Austin. Software Engineer, Google.

Michael Flaster, BS Computer Science MIT, MS Computer Science Berkeley. Staff software engineer, Google.

Dinah McNutt, Software engineer, Google.

Brian Tivol. Senior Software Engineer, Google. BS Mathematics with Computer Science, MIT.

Robin Hillyard, Ph.D. Computer Science, Cambridge University; Vice President and Senior Research Fellow, Optum.

- National Board of Directors reportedly passed a modified version of the Technology Motion (Item 143-150). See December 29th update.

- Gary Hann reports in a Bridge Winners discussion that there were “heavy and lengthy questions and discussion about the $1.5 million ACBL Score debacle” in Providence. Also from the same discussion, JoAnn Sprung reports:

I can report on my exchange with Mr. Hartman. After he gave an overview of the Bridgescore+ debacle he said something rah rah like let’s not dwell on the past, we're going to roll up our sleeves and get to work. He said that he has formed a technology committee to investigate further. He mentioned that the committee would be populated with board members, staff and technology advisors. Naming Greg [Humphreys] and Uday [Ivatury, CEO of BBO] as members.

I asked him who was chairing the committee. He said that he was. I asked if the staff members who decided to trash the program would be on the committee. He danced a bit but acknowledged that they were on the committee. (Both points were left out of his overview.)

I asked how the membership could have confidence in the results of the committee to investigate his decision if he was chairing it and suggested that he step aside in the interest of transparency and membership confidence. He said, “I disagree.” Wow that was comforting.

When I asked him how he could justify spending in excess of $200,000 on ACBL Live when other tested and universally accepted software was available to the league for nothing or very little expense he replied: “We want to control our own destiny.”

A great marketing phrase that when boiled down says nothing. So I said that it seems that they want to spend money developing programs internally just to have it done by you rather than going outside. He replied: “Now you are putting words in my mouth.”

Is the BOD totally snowed by this guy? Seems so…

- The Technology Committee has been created. It will be chaired by CEO Robert Hartman and include three National BoD members (Jay Whipple, Russ Jones, and Merlin Vilhauer), three outside members (Greg Humphreys of Bridge Winners, Uday Ivatury of Bridge Base Online, and Ralph Lipe, a former system architect at Microsoft), and three members of ACBL management (Tony Lin, Ken Howedel, and Bruce Knoll).

Adam Parrish has more in the new ACBLscore Update thread on Bridge Winners. Mr. Hammond has contributed detailed comments to the post. Here are some revealing excerpts, edited to use the full phrase when introducing acronyms for the first time.

ACBL have been very, very, careful to state that the “ACBLscore+ Project” was a failure, but they have not said the same about the software developed under the “ACBLscore+ contract”. When they have made comments, they have been very careful in what they state. They may state that parts were not complete (true), but won’t say why (no specs delivered). I still stand by all the work done during the “ACBLscore+ contract”.

ACBL wanted to change the terms of the original contract starting mid to late 2013. This was over a year into the contract. I think they had used outside counsel for the Learn to Play Bridge (LTPB) contract and were reviewing all their existing contracts. ACBL then stopped paying invoices. I was very surprised when Robert Hartman admitted to this in the BOG meeting in Providence last month. He didn’t say why they stopped paying, but ACBL went several months without paying invoices. When they did finally pay (after much legal work on our end), they backdated some of the checks (checks dated a certain date, but not mailed out until later). Do an audit, and some of these delayed payments won’t show up, until you look at the date they were cashed. We ended up photocopying date-stamps from the envelopes when the checks finally arrived. Please note that all of our contractors/employees were paid on time; Hammond Software (HS) had to take out a substantial loan to make sure they were paid. Robert did not state the reason, but the payments stopped when they first wanted to change the legal terms of the copyright of the original contract.

BTW, this non-payment of invoices had a direct impact on the contract work. We did put some people on “furlough”, unfortunately some of them were working on critical path items, so it did have a significant effect. It wasn’t up to us to fund this project.

Hammond Software (HS) terminated the ACBLscore+ contract effective March 31, 2014. It may be better to say that this was a mutual termination date (we agreed the date with ACBL) as there was a clause in the contract for mutual termination. Very easy for ACBL to terminate at any time (15 days notice after completion of any phase with no explanation needed). Much harder for us to terminate. Let’s politely say it was a mutual termination date.

On our end we used various program management (PM) techniques, including heavy use of Agile. It worked very well for us. For interactions with ACBL we had to use more traditional PM methods. At one point I was told that someone at ACBL was going to track all items using pencil and paper (I’m not kidding). …

The first final ACBLscore+ status report (March 31, 2014) was 141 pages. The May 2014 [report] was 145 pages. Included in these status reports were the state of the project, what worked, what didn’t, what we were waiting on, etc.

Unfortunately ACBL’s position is that it wants to “control its own destiny” to quote words from the BOG meeting. There is a very strong NIH (Not In House) mentality at ACBL HQ. While this mentality persists, there will be little new development.

The good news is that all the good ideas from ACBLscore+, e.g. Fast Results, use of projectors, automatic assignment of tables in KOs/Swiss are going to be implemented. We just don’t know when, or how much the final price tag will be ($600K was quoted). But they will be developed in-house. Unfortunately the in-house experience isn’t good.

- Donald Mamula, the D19 national representative, states on Bridge Winners that Item 143-150 was withdrawn by the maker of the motion, thus no full board discussion, vote or action. Mr. Mamula’s full statement, spread across two comments on the same day, was:

As a matter of clarification: This technology committee was established by the CEO, who determined its composition and purpose. The existence of this new committee was revealed to the BoD towards the end of our Providence meeting (IIRC, it was on Wednesday [November 26th]). At no time did the BoD take any formal action in regards to this committee, thus there is no “motion text”, only the verbal announcements of its creation to the BoD and later to the Board of Governors. The committee has been granted no powers or authority by the BoD and does not represent the BoD, other than having three board members serving as committee members.

The journal motion was withdrawn by the maker of the motion, thus no full board discussion, vote or action. Since the BoD took no action, the only committee currently addressing this issue is the one appointed by the CEO. Whether or not the board has its own 2014 technology committee is a question that will be answered when the 2014 president announces the board committees and their composition.

I can state that the announcement was made shortly before discussion and voting was to take place on the journal motion. Beyond that factual statement, anything else would be pure speculation, and I decline to do that. I hope you can understand my desire to only present facts with as little conjecture as possible.

It appears that ACBL management was trying to get ahead of the BoD. This development unfortunately means the Technology Committee reports to management rather than the BoD. Hopefully this does not limit the scope of its discussion and investigation.

- The maker (and withdrawer) of the Technology motion was Rich DeMartino (D25, New England). The statement below is excerpted from his December 2014 district director report.

Many of you know the recent status of ACBL Score but I will provide a brief summary. The new ACBL Score+ was initially scheduled for completion in mid-2014 at a cost of $1.5 million. As of November 2013 (about 1 1/2 years into the project), we were advised we were 2-6 weeks behind schedule. In March, we were told the project was substantially behind schedule. As a result, the CEO selected an advisory group to make an independent judgment and recommendation as to the viability of continuing the ACBL Score + Project. The advisory group unanimously recommended that work on ACBL Score+ be terminated and that we enhance the existing 32-year old system.

When this decision was announced, numerous highly qualified systems development experts criticized the action, primarily because they believe the plan to enhance our 32 year old DOS based system is a very poor option. As a result, I submitted a motion that recommended we stop work on updating the old system and conduct an independent review by outside systems experts to evaluate the decision and to decide how best to move forward. When I learned of the plan described below, I chose to withdraw my motion.

The proposed plan is to have a committee evaluate the current ACBL Score plan both to determine whether enhancing the existing ACBL Score is viable and, if so, to evaluate the work planned for 2015. The Committee will have three Board Members, including Jay Whipple, three members from ACBL Staff and three outside systems experts (bridge players). Jay assures me the three outside systems experts selected will be outstanding and he further assures me he is confident the group is fully capable of carrying out the task assigned.

- The Technology Committee met (see minutes). The key points regarding ACBLscore+ are:

  • The committee has reached consensus on ACBLscore+ based on review of code, documentation and conversations with the developer.
  • Developer is estimated to be up to 18 months from completion of the project.
  • Management Technology Committee believes the best path forward is not to continue work with Hammond Software. Committee will embark on a process to design a plan forward that best meets our Stakeholders needs.
  • Gap analysis performed by ACBL should have done a better job breaking out the larger, critical items from the more minor issues.
  • ACBL should have more actively managed the developer and should have been able to identify the issues with the developer earlier.

These conclusions are significant because it appears the consensus includes the three outside experts, even if the word “unanimous” is not used. In any case, outside experts Uday Ivatury and Ralph Lipe have publicly stated on Bridge Winners that they essentially agree with the statements being made by outside expert Greg Humphreys (Mr. Lipe’s comment, Mr. Ivatury’s comment). Unfortunately, the minutes don’t include any specifics on why the Technology Committee believes the best path forward is not to continue work with Hammond Software. Nor is the role of the ACBL in creating this situation detailed beyond the last two items above. Bridge Winners is the only source of additional information, and that information is almost entirely coming from Greg Humphreys and Nicolas Hammond.

Outside expert and committee member Greg Humphreys commented on Bridge Winners:

The 18 month number was something Nic[olas Hammond] stated to me and [committee member] Uday [Ivatury] in a phone call. The meeting minutes don’t capture this, but that estimate was wall-time for more than one person.

No disagreement here that the ACBL could / should have managed this project much, much better on their end. If they’re not convinced of this by now, I don’t know what else can be done / said.

and went on to say:

Did the ACBL manage this project poorly on their end? Yes, absolutely, and that’s probably an understatement. Things never should have gotten this bad, and much of that is on them. Did the delivered product fall very far short of what was required by the contract? Yes. There’s plenty of finger-pointing to go around on that front, and since I don't have a time machine, I doubt I’ll ever really know who’s telling the truth about everything. I think we've reached a point where apportioning blame is less important than finding a workable solution moving forward.

Then Greg Humphreys and Nicolas Hammond got into a long and tedious pissing match on Bridge Winners. Some of this was due to a misunderstanding that took three days to sort out: it turns out that Greg was not evaluating the latest version of ACBLscore+. The fact that only one outside expert, Ralph Lipe, is familiar with using ACBLscore to direct games has led to further confusion. They are all smart guys, and given time to sort things out I think they would get there, but it seems the committee has already chosen not to work further with Hammond Software.

As for the code that currently exists, the outside experts have favorable comments: “The things that it *can* do, it appears to do quite well” (Greg Humphreys), “I haven't looked at all of it [the code], but what I’ve seen appears pretty good to me” (Greg Humphreys), and “Most of what I have looked at seems well coded” (Ralph Lipe).

- Nicolas Hammond responds to Greg Humphreys’ request for a list of what it would take to finish ACBLscore+. His answer starts about halfway down a very long comment on Bridge Winners; search for “You wanted a full” to find the start of his list. Here is a condensed version of Mr. Hammond’s list:

  1. Need to complete masterpoint calculation code. 95% done. Completion blocked on receipt of technical details from ACBL. Note: Information publicly available in Masterpoint Award Rules & Regulations and Chapter 4 of the ACBL Handbook of Rules and Regulations is not detailed enough.
  2. Need to support a few more movements, including some Howells. Percent completion unclear.
  3. Need to implement manual editing of movements (EDMOV functionality in ACBLscore). Barely started.
  4. Tournament Finance Reports. 50% progress? A late Change Request by ACBL significantly set back the effort (ACBL decided tournament directors were “too old” to learn Excel, a product that is now 28 years old itself!)
  5. Club Finances. Small amount of work left. Significant testing required.
  6. Testing of masterpoint code for special games, e.g. charity games. Seems like it should be quick since these are based on the tournament masterpoint calculations with a different scaling factor (the R factor); see the sketch after this list.
  7. Program help and documentation. 20% done?
  8. Support for all club types. It is not clear what this means. Small amount of work.
  9. Legal issues. Seems to be unnecessary problems created by the ACBL.
  10. Minimum system hardware supported must be decided. Probably ACBL’s fault that this is not finalized.
  11. Minimum OS version supported must be decided, e.g. support XP or not. Probably ACBL’s fault that this is not finalized.
  12. Finalize Bridgemate/BridgePad/BridgeScorer support (for pair games). 50+% complete?
  13. Support Swiss teams on Bridgemates. New request. Not started. Could be added to a later release of ACBLscore+.
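
To make item #6 concrete, here is a minimal sketch of what “a different scaling factor (the R factor)” might look like in code. It is purely illustrative: the function names, the base award formula, and the R values are my own placeholders, not ACBL formulas or anything taken from the ACBLscore+ code.

  # Hypothetical sketch only: special-game masterpoint awards computed by
  # scaling a base (tournament-style) award. The base_award() formula and
  # the R-factor values are placeholders, not real ACBL numbers.

  R_FACTORS = {
      "club": 1.00,       # ordinary club game, no scaling
      "charity": 1.25,    # placeholder value
      "membership": 1.50, # placeholder value
  }

  def base_award(tables, rank):
      """Placeholder for the regular masterpoint calculation."""
      return max(0.0, (tables - rank + 1) * 0.10)

  def special_game_award(tables, rank, game_type):
      """Apply the game-type scaling factor (the 'R factor') to the base award."""
      return round(R_FACTORS[game_type] * base_award(tables, rank), 2)

  # A pair finishing 1st in a 10-table charity game under these made-up numbers:
  print(special_game_award(10, 1, "charity"))   # 1.25

If the real relationship is this simple, then testing the special-game code should indeed mostly reduce to re-running the existing tournament masterpoint tests with the scaling factor applied.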

Mr. Hammond cites item #3 as a significant difficulty. I’m not very experienced with the EDMOV functionality from a director’s perspective, but I have some familiarity with it from digging around in the guts of ACBLmerge, and I try to support EDMOV changes in my ACBLgamedump software. EDMOV is not commonly used. It seems like the most common EDMOV cases could be implemented in an initial ACBLscore+ release with further adjustments down the road.

- Bridge columnist Frank Stewart alludes to the ACBLscore+ debacle in his daily column:

Amid the [2014] ACBL Fall Championships, the natives were restless. Concerned players were grumbling about—among other things—shortcomings in the league’s technology, including a $1.9 million outlay for a new scoring system that became a fiasco.

- The Technology Committee held a Q&A session at the New Orleans NABC. In a victory for transparency, the ACBL allowed the event to be recorded, and committee member Greg Humphreys posted the video to YouTube. The members from left to right are: Robert Hartman (ACBL CEO), Tony Lin (management, software architect), Bruce Knoll (management, IT director), Ralph Lipe (outside expert, formerly of Microsoft), Greg Humphreys (outside expert from Bridge Winners), Jay Whipple (D9 director, Florida), Russ Jones (D10 director, South), Ken Howedel (management, software project manager), Merlin Vilhauer (D20 director, Oregon area), and Uday Ivatury (outside expert from BBO). Note: Uday is out of the frame most of the time.

The meeting was two hours long. The following is a discussion of key points. Important quotes are presented with minimal editing and are accompanied by links to the specific time point in the video where the statement is made.

Interesting historical tidbits

Merlin Vilhauer is a real old-timer. He wrote the first ACBL scoring program, used from 1980 to 1991, at which point Jim Lopushinsky was hired. For nine minutes before the meeting starts, you can hear Merlin telling Uday about the very beginning of computer scoring. Merlin mentions the ACBL using a 45 lb computer called the North Star Advantage. With a Z-80 CPU, 64K of RAM, and 180K 5¼" floppy drives, running CP/M, the machine is functionally similar to the TRS-80 Model III, the first computer I ever purchased ($850 from a private seller, paid for with profits from a newspaper route). The ACBL made a quick transition to IBM PCs in 1982 or 1983. Early on, memory was a significant limitation: twenty-one separate program modules had to be swapped in and out of RAM to run his scoring program, and the swapping had to be manually planned to avoid collisions. Merlin said many of the ACBLscore screens are similar to those of his original program, and that the score entry style of, say, four-two-minus for -420, where the trailing zero is dropped and the minus key serves the role of the enter key for negative N-S scores, goes back to him.
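
As an aside, the entry convention Merlin describes is simple to state precisely. The sketch below only illustrates that convention; it is not actual ACBLscore code, and the function name and details are mine.

  # Illustrative sketch of the score-entry convention described above:
  # the trailing zero is dropped ("42" means 420) and the minus key both
  # negates the score for N-S and acts as the enter key ("42-" -> -420).

  def parse_score_entry(keys):
      """Convert a keystroke sequence such as '42-' into a signed score."""
      keys = keys.strip()
      negative = keys.endswith("-")        # minus doubles as enter for negative scores
      score = int(keys.rstrip("-")) * 10   # restore the dropped trailing zero
      return -score if negative else score

  assert parse_score_entry("42-") == -420  # four-two-minus, as in the example
  assert parse_score_entry("65") == 650
  assert parse_score_entry("11-") == -110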

Topics not directly related to ACBLscore

Jay Whipple said (at 53:40), “I’m very encouraged by the work that’s being done on [ACBL Live].” ACBL Live is the ACBL’s in-house project to provide the functionality currently provided by Fast Results and Bridge Results.

CEO Robert Hartman said (at 1:02:06), “Members of this committee are helping vet CIO resumes” and “We are trying to get the CIO on quickly before this committee gets too far down the road.”

Electronic payment seems likely soon. There was mention of a partnership with a subsidiary of PayPal, confirmed to be Braintree at the Board of Governors meeting on March 15th, for credit-card processing. In addition to Mastercard and Visa, payment options will include American Express, Apple Pay, and even Bitcoin.

Core ACBLscore matters

Greg Humphreys’ summary of the committee’s ACBLscore+ assessment to date begins at 17:38. He backs up management’s position in saying that the original ACBLscore+ contract called for a “complete replacement of ACBLscore” and that the software delivered cannot do that, noting in particular the lack of full support for pair games and problems with tournament finance reporting. Greg noted that the knockout team game functionality seemed to work really well and that people who had used this functionality were very happy with it.

The committee is in unanimous agreement that ACBLscore is at the end of its lifespan and that it needs to be replaced in the next 2–5 years. Listen at 1:20:37.

In response to issues with implementing the January 2015 masterpoint change in ACBLscore, Ken Howedel said (at 29:15), “[the] program [ACBLscore] is very patchy. [Version] 7.9.1 will be used at this tournament [the NABC].” This version is not yet available for download.

The committee killed management’s plans to enhance ACBLscore. Rethinking several technology projects reduced a $484k budget allocation to $113k. The committee is resolved that any enhancements to ACBLscore must be modular so that they can also be used in the successor program.

Why was ACBLscore+ discontinued?

For the short answer, see Greg’s statement in the last section.

Russ Jones (at 45:20) mentioned going through a checklist:

We had a list of requirements of this is what we are looking for. Can it do these things? And so we went through there and tried to do movements and tried to do masterpoints and things that were missing because this was supposed to be the end project. This was supposed to be finished. So we looked at that and said those things are not done… And on top of that what about look and feel… It’s a totally different UI… I was looking at it from the perspective of the 3000+ club members… no function keys, no shortcuts, no anything… It didn’t have some user friendliness… Can I find these help screens?… no.

There was some discussion about whether the ACBL performed their evaluation on the last delivered code or an earlier version. CEO Hartman (at 31:34) said:

We saw many demos… didn’t see final product demoed. That was probably a mistake.

However, Greg Humphreys has examined the final delivered code. I don’t think the ACBL would have reached a different conclusion if they had seen the final code.

Can anything be salvaged from ACBLscore+?

This is a hard question. As a software guy myself, I am sympathetic to the committee’s difficulty in giving a firm answer at this point.

Greg Humphreys said (at 58:55 and 59:51):

You don’t want to end up with some sort of Frankenstein monster of a system in the end… you don’t want this cobbled mash of stuff… [hence] my reluctance to recommend doing something like rolling out the KO portion of ACBLscore+ which does work fine.

A lot of the stuff that is useful in [ACBL]Score+ that I’ve found is these separate command line utilities that don’t run in the cloud and are not platform specific… specifically interchange with [ACBLscore] game files… bit for bit compatible with ACBLscore. I’m pretty eager to find out exactly what they are.

In response to a question about accountability from Allison Brandt, Greg stated (at 1:03:49):

I can’t hold anyone accountable for anything since I don’t work for the ACBL. But one of the things I’m trying as hard as I can do is actually to minimize the waste you’re describing ’cause it would be you know it sure sounds like a waste to spend a million plus dollars and then just throw the whole thing away. I don’t think we need to throw the thing away.

It’s not going to get rolled out as delivered. It can’t be. But I really believe there’s a lot of things in there that would bring enormous value to whatever we build going forward. That’s why I’m spending so much time trying to understand exactly what’s there so that all the stuff that got paid for actually get used as much as possible. That’s not anything about holding anyone accountable but it will certainly minimize—I want to minimize waste because we certainly paid for something and we ought to make as much use of it as we can.

What form should the ACBLscore replacement take?

ACBLscore+ as developed by Hammond Software is a web-based application. Since many club games are run where there isn’t an internet connection, ACBLscore+ runs a local web server, which the ACBL has taken to calling the Personal Web Server model. I had doubts about this model from the get-go, particularly in terms of responsiveness, because there is significant overhead even in communicating with a local web server. I argued above that it might have been better implemented using Java or .NET.
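
My responsiveness concern is easy to put a number on. The toy benchmark below is my own harness, not code from ACBLscore+; it times a direct in-process call against the same call routed through a loopback HTTP request.

```python
# Rough comparison of an in-process call versus the same call made through a
# local web server, the model ACBLscore+ uses when offline. This is my own
# toy benchmark, not code from ACBLscore+.

import threading, time, urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

def score_lookup() -> bytes:
    return b"3NT by S, making 4: +430"

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = score_lookup()
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)
    def log_message(self, *args):   # keep the benchmark output clean
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"

N = 200
t0 = time.perf_counter()
for _ in range(N):
    score_lookup()
direct = time.perf_counter() - t0

t0 = time.perf_counter()
for _ in range(N):
    urllib.request.urlopen(url).read()
via_http = time.perf_counter() - t0

print(f"direct: {direct*1e6/N:.1f} µs/call, "
      f"via localhost HTTP: {via_http*1e3/N:.2f} ms/call")
server.shutdown()
```

The absolute numbers vary by machine, but every loopback round trip carries fixed overhead that a native, Java, or .NET front end would not pay, and a chatty UI pays it on every interaction.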

The committee has not decided the best approach. Greg (at 1:22:50) said:

We haven’t figured out whether it makes sense to directly use parts of ACBLscore+ or just use the code as a blueprint to build something new.

That [the Personal Web Server model] is actually something we haven’t all agreed on… My opinion is that it can be made viable if you are willing to tell clubs that are running laptops from 10 years ago that it is time to go spend $400 on a new computer at Best Buy.

Most of the computers I run into in ACBL-land run XP.

APIs in our future

The committee is in strong agreement about opening up ACBL data to third parties. I think there have been forces inside management pushing for this for years. But the addition of the outside experts to the committee seems to have strengthened their hand considerably. The committee is having serious discussions about Application Programming Interfaces (APIs), the technical means by which such access is given. Here are some quotes.

Jay Whipple’s response is to an audience member who said, “I really really like Jay’s Fast Results… I find it personally superior to ACBL Live… I would’ve liked to see ACBL save that money and work it out with Jay instead of spending that money in-house.”

Jay Whipple (at 1:45:18):

Fast Results has been a prototype for two and a half years and the truth is that I don’t want to do the production any more. ACBL is far more qualified to do this on a production basis. What I enjoy doing is developing new and creative things… I don’t want to run production. ACBL should run production on Fast Results… It’s also important to tap into the creativity of our membership and allow them access to information… that’s the kind of vision and opportunity we need to enhance and provide our members the opportunity to build creative things. There’s a lot of creative talented people out there who if given access to this information would be able to help and develop and prototype new things that we haven’t even thought of before. We will not be able to develop all the future applications ourselves and I don’t think we have a monolithic vision that the ACBL is going to do that.

CEO Hartman followed up (at 1:47:13):

We are in agreement with Jay on that and that’s one thing that’s come out of the last few months is this coming together of ideas. I thought it would be a lot tougher to get everyone on the same page but as we talk through these ideas there has been unanimity on the way to proceed.

Bruce Knoll continued (at 1:47:45):

Internally we’ve been working since I’ve been on board to put ourselves in a position to start partnering with organizations and over the last 12 months we’ve been doing that in some areas.

Greg Humphreys said (at 1:48:49):

The ACBL, and Bruce [Knoll] in particular, has done a bunch of things recently about opening up their data to third party developers through normal APIs. We haven’t integrated into the [NABC] entry sales stuff [on Bridge Winners] yet but we will be able to do things like pull down player names and information and stuff just from the number and validate it and check if you are eligible for the Platinum Pairs before you buy an entry for it. That has been really encouraging and I hope that continues especially as you start to do things like getting results into a large database.

Being able to mine that database for information will be unbelievably valuable in ways that people don’t even really realize yet. Fast Results and ACBL Live, I hope that whenever all the dust settles and we have a new thing and can finally take ACBLscore out in the back and shoot it, we’ll have an environment where you can say, “I hate ACBL Live, I can do better”, that you can just do better. You can just build your own thing and because everything we built is in an appropriate, open, and modular way, you can build a competitor to ACBL Live and if it is really better, people will just use it. And it will be up to you to develop something similar enough that people will not freak out because it’s so different but better enough that people will just naturally gravitate towards it and ACBL Live would just die.

Competition is good. If the software is done properly you can actually have micro-competition for little individual modules. If you don’t like the way your [the ACBL] thing is texting results to me, write your own. We would definitely like to make that available, but again creating an architecture where that sort of thing is possible is really hard and the right person needs to be directing that large architectural effort, not the pieces but that big vision of how do we build software so that that can happen, that’s not easy. If you don’t get it right from the beginning it’s hard to extricate yourself from the mess.
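
To make the API talk concrete, the player lookup Greg describes might look something like the sketch below. The endpoint, field names, authentication scheme, and eligibility rule are all hypothetical; the ACBL has not published such an API.

```python
# Hypothetical sketch of the sort of player lookup Greg describes: fetch a
# member record by player number and check Platinum Pairs eligibility before
# selling an entry. The endpoint and field names are invented; no such
# public API has been published by the ACBL.

import json
import urllib.request

API_BASE = "https://api.example.org/acbl/v1"   # placeholder, not a real host

def get_player(player_number: str, token: str) -> dict:
    req = urllib.request.Request(
        f"{API_BASE}/players/{player_number}",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def eligible_for_platinum_pairs(player: dict) -> bool:
    # Assumed rule for illustration: eligibility keyed off platinum points
    # earned in the qualifying period.
    return player.get("platinum_points_qualifying_period", 0.0) >= 1.0

# Usage (would require a real endpoint and token):
# player = get_player("1234567", token="...")
# print(player["name"], eligible_for_platinum_pairs(player))
```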

Jay Whipple said (at 1:54:34):

We are done with ACBLscore. We are going to maintain it so that we can exist but we are not investing any more money in ACBLscore and we are moving forward and we would like that transition to be as transparent to the users as possible, ideally they don’t even know. Externally, they will see new access to that information. They see it on their phones, their computers, on the wall [via a projector]… we also want to tap into the creativity of our existing player universe so we can develop more creative applications.

I really hope all these good intentions come to pass. I strongly feel that good technology, and a technology ecosystem built around it, are a necessary though not sufficient condition for bridge to thrive in the 20–40 year timeframe. I’ll keep pushing hard on these matters. I expect to be around to see this future.

In order to enhance data collection in the short term before ACBLscore is retired, Russ Jones mentioned plans to roll all ACBLscore game files into the Monthly Masterpoint Reports. I pointed out this possibility on Bridge Winners (here and here) last August, though I will not take credit for the idea because it is the obvious short term fix to anyone who knows how their data pipeline works. Also mentioned was the possibility of having the ACBLscore DBADD command transmit the game file directly to the ACBL. This would be more timely than including the game files with the once-per-month masterpoint reports.
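
A DBADD-triggered upload would not need to be elaborate. A hook along the following lines, run after DBADD posts the session, would get game files to the ACBL the same day a game is scored. The URL, file name, and lack of authentication are placeholders; the real mechanism would be up to the ACBL.

```python
# Sketch of a post-DBADD hook that transmits the just-posted game file to the
# ACBL. The upload URL is a placeholder; the real mechanism (and any
# authentication) would be up to the ACBL.

import pathlib
import urllib.request

UPLOAD_URL = "https://example.org/acbl/gamefile-upload"   # placeholder

def upload_game_file(path: str) -> int:
    data = pathlib.Path(path).read_bytes()
    req = urllib.request.Request(
        UPLOAD_URL,
        data=data,
        headers={"Content-Type": "application/octet-stream",
                 "X-Filename": pathlib.Path(path).name},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Usage after DBADD posts the session (file name is illustrative):
# upload_game_file("150301.ACA")
```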

Lack of introspection about the ACBLscore+ debacle

Although openness in the form of data access and concrete APIs appears to be in our future, there is decidedly less openness about exploring the ACBL’s seemingly not insignificant contribution to the ACBLscore+ debacle. I’ll keep hammering the ACBL on this because I feel that failing to come clean unjustly injures the reputation of Hammond Software, increases the chances of similar mistakes being made in the future, represents a lack of transparency, and disrespects the ACBL membership.

Here are some of the comments related to accountability.

Jay Whipple said (at 1:14:15):

If you don’t think this [ACBLscore+ situation] wasn’t painful to the board, we haven’t communicated well. This was a very painful mistake. We have learned a lot from it. What you see here is the start of our moving forward. We have two different committees. We have the Technology Advisory Committee, which you have in front of you, and the board has put together a board Technology Oversight Committee and we are very very focussed on making sure this does not happen again.

CEO Hartman followed up immediately (at 1:14:44):

I can’t tell you the countless nights that I’ve stayed up. I mean this just runs over my head and as I’ve explained to the board, the biggest mistake I made was not pulling the plug earlier. I mean that’s what I should have done. There were enough signs to see that the plug should have been pulled earlier and based on subsequent conversations with the developer I decided to not lose our investment and try to keep the program alive and keep throwing additional funds at it to make it work and in retrospect that was a big mistake.

Kevin Lane in the audience commented (at 1:16:01):

I hear with regard to the ACBLscore problems in January that some specific masterpoint calculations still aren’t known, and of course there is the discussion about the Swiss matching algorithm… if he [the vendor] didn’t get specifications, which is something I said three years ago needed to be defined, what is he supposed to do? If he didn’t get specifications, it is really hard to do that [complete the project].

Uday Ivatury responded (at 1:18:00):

I just want to address the specifications work. I think it is going to be a cold day in hell before we ever spec the classic thing [ACBLscore] out perfectly. It’s just so old and complicated.

Greg Humphreys followed up with (at 1:18:19):

It’s too big and it’s too complicated and it’s too sprawling… Listen, ACBLscore has done amazing, I mean it has lasted 30 years which is extraordinary for software so ACBLscore deserves a standing ovation; however it has been incrementally developed by one or two individuals over 30 years and I’ve looked at the source and I can’t unsee it now, it’s rough. We’re not really going to get a full spec for what it can do which is actually perhaps a blessing. I mean you don’t exactly want to replicate all the warts of what is going on in ACBLscore. What you do want is more of a clean room spec. What do we actually want the program to do? How should it work, rather than figuring out how it does work.

Did you know for example that ACBLscore can score duplicate spades games? Yeah it can. These are the things I’ve learned. I can’t unknow these things now… Even coming up with a full spec for everything ACBLscore should do sounds like a daunting task. But specifying how a piece of software should run a knockout does not sound so bad to me, especially because we have something that does it pretty well now and we can look to it for guidance, which is [ACBL]score+. It does a good job so we should absolutely look to it [ACBLscore+] to see how it should be done.

Jay Whipple forgets the lack of communication when the ACBLscore+ cancellation was announced and the biased communication thereafter, and ignores the fact that the Technology Advisory Committee exists largely in response to outrage from the ACBL community, specifically the letter prepared by Greg Humphreys and signed by 30 or so ACBL members with software expertise. He doesn’t say whether the board feels any responsibility for the situation. I don’t mean these criticisms personally, only with regard to his description of the board’s feelings as a whole.

Mr. Hartman blames himself for not stopping the project earlier but doesn’t address management’s role as a whole in getting to this point.

The discussion about specifications is interesting because it reveals a changing and better approach. I think Uday and Greg are right. Too much time was spent replicating ACBLscore functionality without reexamining what should be done. I think a detailed record would paint ACBL technical management in a negative light here.

I still want ACBL management to come clean. I am not calling for anyone’s resignation or termination, certainly not without more facts. Here is an incomplete list of issues where I believe ACBL management bears significant responsibility.

  1. Backdated payments to Hammond Software

    Hammond Software asserts payments were backdated (by months). This is a dangerous and libelous assertion to make if false, since it could so easily be demonstrated to be false. The only out would be if the contractor left check(s) uncashed for months to embarrass the ACBL, but I don’t know many businesses that would take that risk. Big checks are usually cashed immediately.

  2. ACBL did not deliver various specifications.

    Hammond Software asserts that specifications related to masterpoint calculations and eligibility, Swiss team pairing, and the manual editing of movements (EDMOV) were not delivered when due. This sounds boring and technical—and it is—but software is mostly boring and technical. Failings like this have real consequences for schedules.

    I basically agree with Uday’s “cold day in hell” statement. Still, there is no evidence that the ACBL ever said to the contractor, “just do something sensible (and we’ll improve it later) because we cannot provide a spec.”

  3. ACBL reversed course on Tournament Finance Reporting (TFR).

    Hammond Software asserts that the ACBL reversed course on TFR after most of the code was written because they decided that tournament directors were too old to learn Excel, a Microsoft product that is now 28 years old.

  4. Insistence on replicating exact ACBLscore calculations.

    Some calculations might vary by 0.01 (masterpoints, percentages, etc.) between ACBLscore+ and ACBLscore due to limitations in the way ACBLscore handles arithmetic. A lot of time was spent trying to match ACBLscore results exactly. There is no evidence that anyone at the ACBL ever said, “a 0.01 masterpoint discrepancy doesn’t matter in the grand scheme of things, just get it working.” A sketch of how such rounding discrepancies arise follows this list.

  5. ACBLscore calculations did not always match ACBL’s specifications.

    Hammond Software asserts that the “error rate for the tournament primarily used for testing was about 10%, i.e. about 10% of game files had incorrect MPs based on the published specs.” This led to confusion over whether ACBLscore+ should calculate according to the specification or according to how ACBLscore calculates, and to extra work ultimately calculating it both ways.

  6. ACBL never really decided on the minimum hardware configuration and OS version that ACBLscore+ had to run on.
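
As promised under item #4, here is a minimal illustration, with an invented award formula, of how two perfectly reasonable implementations can disagree by 0.01 in the final digit simply because they round at different points or use different numeric representations.

```python
# Minimal illustration (with an invented formula) of how two reasonable
# implementations of the same award can disagree by 0.01 in the last digit.

from decimal import Decimal, ROUND_HALF_UP

def award_float(tables: int, factor: float) -> float:
    # Compute in binary floating point and round at the end.
    return round(tables * factor * 0.105, 2)

def award_decimal(tables: int, factor: str) -> Decimal:
    # Compute in decimal and round half-up, as a hand calculation would.
    raw = Decimal(tables) * Decimal(factor) * Decimal("0.105")
    return raw.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

for tables in range(5, 40):
    a, b = award_float(tables, 1.0), award_decimal(tables, "1.0")
    if Decimal(str(a)) != b:
        print(tables, a, b)   # tables where the two methods differ by 0.01
```

The point is not which rounding is right, only that insisting on bit-for-bit agreement with a legacy implementation’s arithmetic is a choice, and a choice with a real schedule cost.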

I want the ACBL to acknowledge these shortcomings or explain otherwise. Items #2, #4, and #5 are probably all related to “technical debt” on a 30-year-old piece of software and an unwillingness to fully document the masterpoint award system (which is a separate issue from the software itself). It’s okay to acknowledge the impact of technical debt—really it is—but we need to hear the words or hear a real explanation of why all (or at least most) of the fault lies with Hammond Software.