ILS Requirements Task Force Reports to MCLS

8/25/14 Report to the MCLS Exec Committee

  1. We began meeting on 8/8/14 and have been meeting twice a week since the following week.
  2. We are using the requirements sections (E and F) from the Orbis Cascade RFP and editing them to fit our needs.
  3. Everyone in the group has an assignment.
  4. We have been in contact with staff at Orbis Cascade; Al Cornish from the Systems office joined our call on Friday.
  5. We have found a few areas where the OC RFP needs extra work: things have changed since it was written; the E-Resource Management section needed to be enhanced; reports needed their own section in the systems part and reworking throughout the entire requirements; and the OC Consortium was very different from ours, especially in its relationship with and reliance on the central office.
  6. We will have a draft ready for the Sept. MCLS meeting, but it will be an early draft.
  7. We will need time after the meeting to create a document that holds together well. At that point we would like input from library staff, especially the committees.
  8. FLVC is working with the group on developing a timeline; we will be discussing the draft in today's meeting.
  9. Janice Henderson joined our call on Friday to give us some context and to thank us for our work so far. She also told us that if there is more work after the Sept. 3rd meeting, this is the group that will be tasked to do it. So it's not just a requirements Task Force.

9/4/14 Report to the MCLS Quarterly Meeting

Jean's notes from the Next-Gen discussion: Next-Gen Requirements

  • shouldn’t there be musts in the doc? It was assumed this will be handled in the evaluation criteria
  • evaluation criteria need to be created before the ITN goes out
  • the consultant helped turn the Discovery Tool questions into criteria; we can look at that as an example
  • will have to rank the criteria (this may be where the musts come in)
  • Timeline
    • Why the rush?
      • Given the legislative calendar, a placeholder has been made for funds, but more specifics would help in assuring approval
      • Assumption that once the funds are approved, we will need to move quickly to spend them
    • After much discussion it was decided to amend the timeline to add about a month more, for requirements creation and for feedback. Lucy is working on a revised version.
    • Also, demos will probably take place after ALA, meaning February
    • Another assumption is that the legislators have an expectation of having a combined system (this is in the law and was mentioned in the OPPAGA report)
    • Should we wait a year until other systems are more mature? We can still decide after negotiations not to move forward with a purchase

10/28/14 Report to the MCLS Exec Committee

  • Evaluation Process
    • Assign points to Sections with weighting for relative importance
Section                                     Percentage
E. (Systems)                                18%
F.1 (Collections and Resource Management)   17%
F.2 (Description and Metadata)              16%
F.3 (Circulation and Resource Sharing)      17%
G. (Discovery and User Experience)          16%
H. (Joint Use Facilities)                   6%
Price                                       5%
Business Ref/Vendor Capacity                5%
    • Write Criteria Statements for each subsection (underway)
    • Evaluation Team members rate each subsection of a response as not met, met, or exemplary (see the scoring sketch after this list)
    • Given that not all subsections are equal, Evaluation Team members individually assign section points based on judgment
    • Tally scores across the Evaluation Team
    • Meet to discuss and adjust the Evaluation Team score as needed
    • Selection of vendors invited for demos is based on highest scores (maximum 4)
    • After demos, the Team evaluates the demos based on the same sections, tallies the numbers, and adds the scores together for final scoring.
    • Evaluation Team picks x number of vendors to negotiate with.
    • Exact logistics for pricing are still being reviewed.
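
To make the tallying arithmetic concrete, here is a minimal sketch of how the weighted scoring could work. The section weights come straight from the table above; the 0/1/2 point mapping for not met/met/exemplary and the simple averaging across evaluators are illustrative assumptions, since the notes leave section-point judgments to individual Team members.

    # Minimal scoring sketch; the point mapping and averaging are assumed,
    # not the Task Force's finalized method.
    SECTION_WEIGHTS = {
        "E. (Systems)": 0.18,
        "F.1 (Collections and Resource Management)": 0.17,
        "F.2 (Description and Metadata)": 0.16,
        "F.3 (Circulation and Resource Sharing)": 0.17,
        "G. (Discovery and User Experience)": 0.16,
        "H. (Joint Use Facilities)": 0.06,
        "Price": 0.05,
        "Business Ref/Vendor Capacity": 0.05,
    }

    RATING_POINTS = {"not met": 0, "met": 1, "exemplary": 2}  # assumed scale

    def evaluator_score(ratings):
        """Weighted 0-100 score for one evaluator's section ratings."""
        score = 0.0
        for section, weight in SECTION_WEIGHTS.items():
            # Scale the 0-2 rating to the section's share of 100 points.
            score += (RATING_POINTS[ratings[section]] / 2) * weight * 100
        return score

    def team_score(all_ratings):
        """Tally across the Evaluation Team by averaging individual scores."""
        return sum(evaluator_score(r) for r in all_ratings) / len(all_ratings)

    # Example: one evaluator rates everything "met" except Systems.
    ratings = {s: "met" for s in SECTION_WEIGHTS}
    ratings["E. (Systems)"] = "exemplary"
    print(round(team_score([ratings]), 1))  # 59.0
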
  • Discovery Tools
    • Discovery and User Experience will be weighted at 16%. If a vendor includes its own discovery tool in response to the ITN, the 16% will be determined based on the capabilities of that discovery tool. If the vendor does not include its own discovery tool, the 16% will be determined based on the capability of the ILS to work with outside discovery tools.
    • Different possible outcomes concerning Discovery Tools:
      • ILS vendor and DT from same vendor
      • ILS vendor subcontracts DT (or vice versa), but includes DT in bid
      • ILS vendor can work with various DTs and doesn't choose one
    • Options need to be reflected in the instructions for Pricing; vendors should decouple their prices for the ILS and DTs, which will give us the maximum flexibility for selection (see the sketch after this list).
    • The Task Force felt it is important to understand the options and make sure that the process supports the best outcome for our forty unique institutions, an outcome which is not known at this time
    • Is the Task Force correct that the primary focus is selection of a next-gen ILS? The ILS selection should be made separately from the discovery tool. If it is determined that no discovery tool would meet the needs of FLVC libraries, is continuation with Mango (with the addition of a central index) an option? Secondly, if a new discovery tool is selected but some libraries wish to continue with Mango, would that be an option?
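
As a sketch of why decoupled pricing preserves flexibility, the snippet below enumerates the ILS/DT pairings that decoupled bids allow. The Bid structure, vendor names, prices, and scores are all hypothetical; it only illustrates the pairing arithmetic and the conditional 16% Discovery weighting, not the actual evaluation method.

    from dataclasses import dataclass
    from itertools import product
    from typing import Optional

    @dataclass
    class Bid:
        vendor: str
        ils_price: float           # decoupled ILS price (hypothetical)
        dt_price: Optional[float]  # decoupled DT price; None if no DT offered
        discovery_score: float     # out of 16: bundled-DT capability if a DT
                                   # is included, else ILS interoperability

    bids = [
        Bid("Vendor A", 500_000, 100_000, 14.0),  # ILS bundled with its own DT
        Bid("Vendor B", 450_000, None, 12.0),     # ILS only; works with outside DTs
    ]

    # Because prices are decoupled, any ILS can be costed with any offered DT.
    for ils, dt in product(bids, bids):
        if dt.dt_price is None:
            continue  # this bid offers no discovery tool to pair
        total = ils.ils_price + dt.dt_price
        print(f"{ils.vendor} ILS + {dt.vendor} DT: ${total:,.0f}")
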