AGENDA ITEM REPORT

Title: Report on POST's Online Testing Systems
REPORT PROFILE
MEETING DATE
2/25/2016
BUREAU SUBMITTING THIS REPORT
Computer Services Bureau
RESEARCHED BY (PRINT NAME)
Colin O'Keefe
REVIEWED BY (PRINT NAME)
Dave Cornejo
REPORT DATE
01/22/2016
APPROVED BY
Robert A. Stresak
DATE APPROVED
01/27/16
PURPOSE
Information Only
FINANCIAL IMPACT
No

ISSUE:
After extensive market analysis and efforts to procure a replacement testing system, no suitable replacement for POST’s Testing Management Assessment System (TMAS) could be identified that meets POST’s requirements within budget and staffing constraints.  POST has determined that incrementally improving TMAS, rather than replacing the system entirely, is the most viable and cost-effective path forward for statewide computer-based testing.
BACKGROUND:

POST’s current TMAS system is a vendor-provided solution, implemented in 2006 by Crown Pointe Technology, that met the following objectives:

  • Provide a secure computer-based test administration, delivery, and scoring system, while preventing unauthorized access to testing materials.
  • Support POST management and maintenance of the Basic Course training and testing materials.
  • Support test development item banking, student record management, test scoring, and reporting requirements.
  • Provide testing materials that ensure consistent standards for peace officer knowledge and ability statewide.
  • Retain historical records of test structure, contents, and results for statistical reporting and analysis.

Background and History:

POST’s current computer-based system (TMAS) delivers online tests to approximately 4,500 to 6,000 police trainees at law enforcement academies throughout California each year.  Overall, TMAS has been functional and cost-effective; however, its security architecture allowed unauthorized staff to view testing materials for courses that were not authorized for use at their sites.  In addition, in a number of cases, the system’s printing capabilities were misused at testing sites to print unauthorized test materials.  POST therefore determined that replacing the system would likely improve testing security and deliver a more positive outcome for law enforcement testing.

A Feasibility Study Report (FSR) was approved by State control agencies in July 2011, and the POST Testing System Replacement (TSR) project was initiated in FY 2012-13, after several significant security lapses at law enforcement training academies throughout the state.  These lapses brought to light weaknesses in the TMAS security model, which allowed administrative staff at testing sites to obtain test questions for tests that had not yet been delivered to students.  Several sites took advantage of this design to distribute POST test content as “study guides,” in violation of testing standards.

During the TSR project initiation (in FY 2013), the TMAS vendor stated that it lacked resources to accommodate POST’s request for system changes to support new tests and other training materials, as well as resolve the known security issues, under existing contract arrangements.  This factor contributed heavily to POST’s decision to replace the system.

The project assigned one full-time POST staff person to work with California State Technology Procurement Division (STPD) representatives and begin development of an Invitation for Bid (IFB), intended to procure vendor resources to replace the current TMAS system with a new computer-based test delivery system and to address the known system issues.

Over the past two years, the TSR project missed several milestone dates for the completion and release of the IFB to the vendor community and experienced staff turnover both at the overseeing control agencies and within POST.  During the same period, the current TMAS system vendor was able to deliver several critical fixes to the current system.

ANALYSIS:

1.  Market Research Efforts:

  • Through the Request for Information (RFI) process conducted in FY 2013-14 under STPD guidance, interested vendors provided responses to POST’s requirements for a replacement testing system.  All respondents to the RFI indicated that their Configurable Off The Shelf (COTS) products met some, but not all, of POST’s testing requirements.
  • Many vendors could provide test delivery functions or test development functions.  However, no vendor was identified that could provide a COTS system delivering comprehensive item development, testing, and reporting features while also adhering to POST’s test design and development procedures.  Two respondents stated that they could customize their products to meet all of POST’s requirements through software development; however, this work could not be performed within POST’s project budget.  The remaining respondents stated that they were unable to modify their COTS products to meet the test-design aspects of the requirements.
  • In response to the vendor feedback on the RFI, POST re-analyzed its requirements to determine whether they could be broadened to allow a range of established testing vendors to compete in the replacement effort.  After thorough review, staff determined that the requirements were mandatory and could not be relaxed.
  • Conclusion:  No COTS products met all of the TSR requirements, and the budget allocated to the project for system development was inadequate to fund the customization needed to close the gap.

2.  Staff and budget limitations:

  • One individual was hired into the Computer Services Bureau in 2012 to act as a full-time project manager.  Three (3) individuals from the Standards and Evaluation Bureau, who are involved in test development and support functions, were designated to assist with requirements design while still performing their full-time support duties.  This allocation of staff proved inadequate to meet milestones and keep the project moving forward, and the ongoing budget outlook will not allow additional staffing to be added.  The project also experienced significant delays due to turnover of both POST and Department of Technology project oversight staff.  Of note, five different Department of Technology analysts were assigned to assist with the project over its duration, and each had significantly different requirements regarding POST’s best course of action.
  • Conclusion: The project lacked the appropriate staffing, both in terms of expertise and availability, to meet project obligations.

3.  TMAS security fixes:

  • TMAS vendor Crown Pointe Technology worked with POST to patch the security flaws in the existing application, implementing a much-improved security model that addresses POST’s previous concerns with test security.  Specific changes include:
    • Automated checks to determine if academy administrative users have appropriate permission(s) to access test materials;
    • Elimination of on-site printing of test materials to reduce security breaches;
    • 100% computer-based delivery to proctored sessions at law enforcement academies; and 
    • An upgraded technology platform on the most recent ASP.NET and SQL Server versions.  The application’s system environment is now on par with other systems running in production statewide and helps resolve the known technology weaknesses of the current TMAS system.

Following the above analysis, staff determined that continued use of the currently deployed TMAS system, in its more secure form, will allow POST to meet system requirements within available budget and staffing levels.

This project is supported by POST Strategic Plan Objective B.4.5, “Procure a broad range of computer-based testing and automated scoring tools,” and Objective B.6.1, “Continuously evaluate information technology security and implement upgrades where necessary.”

RECOMMENDATION:
This report is presented for information only.  No action is required.
 
ATTACHMENT(S):
No Attachments Available