IT Security Program Definitions
Authentication – A process to verify that someone is who they claim to be.
Authorization – A process to verify that a person, once authenticated, has been granted rights to do what they attempt to do.
Availability – The NIST Federal Information Processing Standards (FIPS) Publication Series 200, “Minimum Security Requirements for Federal Information and Information Systems,” defines availability as “(e)nsuring timely and reliable access to and use of information.”
Best practice – Ideas that represent the most efficient or prudent course of action based upon collective experience, including:
- Operational practices based on repeatable procedures that have proven themselves over time
- Generally agreed upon processes and policies that should be undertaken when purchasing and deploying information technology projects in order to decrease operational and financial risk
- Strategies derived from industry experts and their working groups who have, through experimentation, observation, and experience, discovered methods for design, development, and operation of information technology systems which increase the degree or chances of success and decrease cost and risk of those systems and their outcomes
- Risk-based policies and procedures that cost-effectively reduce information security risks throughout the life cycle of each information system in an information security program (see http://www.gao.gov/key_issues/leading_practices_information_technology_management/issue_summary)
Breach (HIPAA Breach) – The HIPAA Breach Notification Rule, 45 CFR Sections 164.400-414, requires HIPAA covered entities and their business associates to provide notification following a breach of unsecured protected health information. Section 164.402 defines Breach as “the acquisition, access, use, or disclosure of protected health information in a manner not permitted under subpart E of this part which compromises the security or privacy of the protected health information.”
Breached Credential – A Credential which has been the subject of a HIPAA Section 164.402 Breach (see Breach).
Breached System – A System which has experienced a HIPAA Section 164.402 Breach (see Breach).
Business process (Business function) – a process or function carried out by a Unit and its Workforce in the course of executing its role assigned by the University, or which supports or facilitates such activity.
Business Unit – An administrative or operational entity within a Unit implementing or operating University business processes. These are typically directed by Business Administrators.
Campus Community – Participants in the University business processes including its Workforce, research partners, affiliates, Business Associates covered by HIPAA Business Associate Agreements, and those who, through the University’s business processes, have access to non-Public University Data in the performance of their responsibilities, or who obtain information or services from University Data or University computational resources.
Campus Information Technology Principals – Units providing campus or University-wide information technology services: specifically including ACCC, AITS, the University HIPAA Privacy and Security Officer, and the University of Illinois Hospital and Health Sciences System.
Cloud Computing Services – NIST, in “The NIST Definition of Cloud Computing” (2011) http://csrc.nist.gov/publications/nistpubs/800-145/SP800-145.pdf, states “Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, Servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. This cloud model is composed of five essential characteristics, three service models, and four deployment models.”
The five essential characteristics of cloud computing services are on-demand self-service provisioning of computing capabilities, broad network access to the service, resource pooling, rapid elasticity in provisioning capabilities, and measured service.
The three service models of cloud computing services are organized under a three tier architectural model: Software as a service (SaaS), platform as a service (PaaS), and infrastructure as a service (IaaS). This model can further be extended with tiers for desktop as a service (DaaS), backend as a service (BaaS), and information technology management as a service (ITMaaS).
Cloud Storage – Cloud storage is a model of networked online storage in which data is stored on virtualized pools of storage which are generally hosted by third parties.
Compromise – a threat executed in an IT Security Incident against a University Information System or University Data such that unauthorized changes to its configuration, programs, or data have occurred or become possible. Those actual or possible changes have caused, or may cause, the System or the data accessed by it (or, in the case of stored data, the data itself) to become insecure, such that intended University confidentiality, integrity, or availability needs and standards defined by this Program can no longer be assured.
Compromised System – a University Information System which has experienced or is believed to have experienced a Compromise. A Compromised System can no longer be guaranteed to meet intended University confidentiality, integrity, or availability needs and standards defined by this Program. It is prohibited to connect a Compromised System to a University network, or to use the system for any University business process. The system must be processed through RC.P.5
Confidential data – private organizational information not intended to be disclosed outside the context of the University or organization responsible for that information.
Confidentiality – The NIST Federal Information Processing Standards (FIPS) Publication Series 200, “Minimum Security Requirements for Federal Information and Information Systems,” defines confidentiality as “(p)reserving authorized restrictions on information access and disclosure, including means for protecting personal privacy and proprietary information.”
Contingency operations – the bare minimum services a Unit needs to operate its core business processes.
Control effectiveness – the ability of existing control measures implemented to measurably or estimably reduce risk.
Core business process (core business function) – an essential process or function carried out by a Unit and its Workforce in the course of executing its role assigned by the University, and tabulated as a Core Business Process by the Unit under its Business Continuity Plan in the University’s UIReady (Kuali) Business Continuity Planning tool.
Covered Entity – The University of Illinois, as defined by the Board of Trustees of the University of Illinois in “Designate the University of Illinois Health Insurance Portability and Accountability Act Hybrid Entity and Adopt Health Insurance Portability and Accountability Act Privacy and Security Compliance Policy” dated November 14, 2013.
Credential (Login credentials) – Credentials used to log in to a System or to utilize a network address, usually consisting of a User ID and password. The login process may also use Identification through a PKI certificate, and the Authentication portion of the login process may use Tokens, biometrics, or a set of personal questions that the User must answer.
Critical business process – University core business processes identified by a Unit handling PHI specifically to meet the HIPAA Security Regulation Emergency Mode Operation Plan requirements reviewed in Policy DP.4 Emergency Mode Operations Plan.
Critical data – information which, if damaged or destroyed, would cause considerable inconvenience and/or require replacement or recreation at significant cost.
Critical Levels – There are four levels of criticality in a Disaster Recovery scenario and they have been defined for the University of Illinois enterprise-wide by the Kuali Ready Business Continuity System (https://us.ready.kuali.org). They are as follows:
- Critical 1: Cannot pause and must be continued at normal or increased service load. The Recovery Time Objective (RTO), the maximum amount of time within which activities must be recovered and operational, is 0 to 8 hours. (Examples: inpatient care, police services, network/IT, animal care)
- Critical 2: Must be continued if at all possible, perhaps in reduced mode. Pausing completely will have grave consequences. The RTO is within 8 to 24 hours. (Examples: provision of care to at risk outpatients, functioning of data networks, at risk research)
- Critical 3: May continue in reduced mode or pause if forced to do so, but must resume in 96 hours or sooner. The RTO is within 24 to 96 hours. (Examples: classroom instruction, research, payroll, student advising)
- Critical 4: May be paused or be deferred, then resumed when conditions permit. The RTO is > 1 week, but < 30 days. (Examples: elective surgery, routine building maintenance, HR-training)
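The RTO thresholds above can be sketched as a simple lookup. This is an illustration only; `criticality_level` is a hypothetical helper, not part of the Kuali Ready system, and it assigns anything beyond 96 hours to Critical 4:

```python
def criticality_level(rto_hours: float) -> int:
    """Map a Recovery Time Objective (in hours) to the four
    Kuali Ready criticality levels defined above."""
    if rto_hours <= 8:
        return 1   # Critical 1: recover within 0 to 8 hours
    if rto_hours <= 24:
        return 2   # Critical 2: recover within 8 to 24 hours
    if rto_hours <= 96:
        return 3   # Critical 3: recover within 24 to 96 hours
    return 4       # Critical 4: RTO > 1 week, < 30 days

# Example: a payroll process with a 48-hour RTO is Critical 3
print(criticality_level(48))  # → 3
```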
Data Custodian – a person with a role responsible for providing and supporting elements of an infrastructure in support of access to University Data and its transmission, receipt, and storage, assuring its confidentiality while providing for its availability according to diverse Unit business process needs, and ensuring its integrity. The Data Custodian may also provide and support secure access to computational resources utilized by Data Users, the Workforce, and the Campus Community, including, but not limited to, providing physical security, backup and recovery processes, granting access privileges to system Users as authorized by Data Stewards, and implementing and administering controls over that data.
Data Steward – the individual (or possibly, a group of individuals) who has a role with direct operational-level responsibility for the acquisition, management, and preservation of University Data – usually Unit heads or directors. The Data Steward may be the person responsible for the original collection or aggregation of the data, for example, a Principal Investigator whose study collects ePHI from subjects. As another example, the Data Steward may be the assigned University business process owner, e.g. a department’s Director of Graduate Admissions who supervises the collection of departmental supplemental graduate applications.
Data User – an individual who uses University Data as part of their assigned duties, or in fulfillment of their assigned roles or functions carrying out University business processes, within the University community.
DE-CENT Computing Environment – Units in which each Workforce member may store University Data on their own, or each subordinate Unit and its Workforce may individually choose how to store University Data. In this type of environment, Workforce awareness of what computing equipment exists in their Unit or what type of data is stored may be limited.
Delegate – Staff assuming Program responsibility or implementation duties.
Delegation Agreement – A written document stipulating the terms of the delegation of a duty under this Program, defining what duty is delegated, to whom it is delegated, and the duties still required of the delegating Unit.
Emergency Mode Operations Plan – a plan that enables the continuation of critical business processes as identified by the Unit under the requirements of Policy DP.4 Emergency Mode Operations Plan, developed under the Unit’s DP.G.4 Emergency Mode Operations Plan.
Encryption – the process of encoding messages or information through a cryptographic algorithm so that only parties having authorized access to the encryption keys can decrypt and read the original data.
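The encrypt/decrypt relationship in this definition can be illustrated with a deliberately simple sketch. The toy XOR-keystream construction below (built from SHA-256 in counter mode) is NOT an approved or secure cipher under this Program; it exists only to show that a party holding the key can recover the original data:

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a deterministic pseudorandom keystream by hashing
    key||counter blocks (toy construction, for illustration only)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # XOR the data with the keystream; the same call
    # performs both encryption and decryption.
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

plaintext = b"Minimum Security Requirements"
key = b"example-key-do-not-reuse"
ciphertext = xor_cipher(key, plaintext)
# Only a holder of the key can decrypt and read the original data
assert xor_cipher(key, ciphertext) == plaintext
```

Approved cipher strengths for actual use are given under Medium strength encryption and Strong encryption below.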
Endorsed campus solution (“endorsed”) – A product or service approved by Campus Information Technology Principals or the University Office of Business and Financial Services for purchase or use by the Campus Community, or by a specific subset of those individuals or units. For example, under site licensing agreements, one endorsed encryption product may only be available for purchase from the University by eligible faculty and staff, and a differing endorsed encryption product may only be available for purchase from the University by eligible students.
EPHI (ePHI) – Electronic protected health information as defined in the Health Insurance Portability and Accountability Act of 1996 (HIPAA), Title II Administrative Simplification, Subpart A: General Provisions, section 160.103. Quoting from it,
PHI (Protected health information) means individually identifiable health information.
EPHI is PHI that is transmitted by electronic media or maintained in electronic media. EPHI includes the following types of health information:
- All geographic subdivisions smaller than a State, including street address, city, county, precinct, zip code, and their equivalent geocodes, except for the initial three digits of a zip code if, according to the current publicly available data from the Bureau of the Census, the geographic unit formed by combining all zip codes with the same three initial digits contains more than 20,000 people
- All elements of dates (except year) for dates directly related to an individual, including birth date, admission date, discharge date, date of death; and all ages over 89 and all elements of dates (including year) indicative of such age, except that such ages and elements may be aggregated into a single category of age 90 or older
- Telephone numbers
- Fax numbers
- Electronic mail addresses
- Social security numbers
- Medical record numbers
- Health plan beneficiary numbers
- Account numbers
- Certificate/license numbers
- Vehicle identifiers and serial numbers, including license plate numbers
- Device identifiers and serial numbers
- Web Universal Resource Locators (URLs)
- Internet Protocol (IP) address numbers
- Biometric identifiers, including finger and voice prints
- Full face photographic images and any comparable images; and
- Any other unique identifying number, characteristic, or code that is derived from or related to information about the individual.
Full backup – A backup of all data and configurations of a system including metadata describing the system as well as its installed programs and the operating system.
HIGH-CENT Computing Environment – A highly centralized and controlled computing environment in which a Unit knows precisely what computing equipment and what data it has, and maintains a high level of security in its environment to ensure that Unit University Data cannot be stored in places of which the Data Custodians and Data Stewards are unaware.
High Risk Data – Information assets for which there are legal requirements for preventing disclosure or financial penalties for disclosure. Data covered by federal and state legislation, such as the federal Health Insurance Portability and Accountability Act (HIPAA) or the Illinois Personal Information Protection Act (IL PIPA), are in this class. Payroll, personnel, and financial information are also in this class because of privacy requirements. The Program recognizes that other data, including Confidential Data, may need to be treated as High Risk Data because its unauthorized disclosure or modification through a compromise would cause severe damage to the University. The Data Steward should make this determination. It is the Data Steward’s responsibility to request that the Data Custodian implement the necessary security requirements for High Risk Data under this Program.
For a fuller discussion, please see the University of Illinois Information Security Policy.
HIPAA – The Health Insurance Portability and Accountability Act of 1996 (HIPAA), as later modified by the Health Information Technology for Economic and Clinical Health (HITECH) Act of 2009, enacted as part of the American Recovery and Reinvestment Act (ARRA) of 2009, and as revised under the 2013 Final Omnibus Rule Update.
Impact (in a risk context) – the financial or reputational harm that would be incurred by affected parties including the University if an adverse event occurred.
Impact level – Three impact levels are defined: Low, Moderate, and High.
The potential impact is LOW if the loss of confidentiality, integrity, or availability could be expected to have a limited adverse effect on University business processes, University assets including University Information Assets, or individuals. A limited adverse effect means that, for example, the loss of confidentiality, integrity, or availability might (i) cause a degradation in business process capability to an extent and duration that the organization is able to perform its primary functions, but the effectiveness of the functions is noticeably reduced; (ii) result in minor damage to organizational assets; (iii) result in minor financial loss; or (iv) result in minor harm to individuals.
The potential impact is MODERATE if the loss of confidentiality, integrity, or availability could be expected to have a serious adverse effect on University business processes, University assets including University Information Assets, or individuals. A serious adverse effect means that, for example, the loss of confidentiality, integrity, or availability might (i) cause a significant degradation in business process capability to an extent and duration that the organization is able to perform its primary functions, but the effectiveness of the functions is significantly reduced; (ii) result in significant damage to organizational assets; (iii) result in significant financial loss; or (iv) result in significant harm to individuals that does not involve loss of life or serious life threatening injuries.
The potential impact is HIGH if the loss of confidentiality, integrity, or availability could be expected to have a severe or catastrophic adverse effect on University business processes, University assets including University Information Assets, or individuals. A severe or catastrophic adverse effect means that, for example, the loss of confidentiality, integrity, or availability might (i) cause a severe degradation in or loss of business process capability to an extent and duration that the organization is not able to perform one or more of its primary functions; (ii) result in major damage to organizational assets; (iii) result in major financial loss; or (iv) result in severe or catastrophic harm to individuals involving loss of life or serious life threatening injuries.
Harm to individuals as described in these impact levels is easier to understand with examples. A compromise of the confidentiality of PII at the low impact level would not cause harm greater than inconvenience, such as changing a telephone number. The types of harm that could be caused by a compromise involving PII at the moderate impact level include financial loss due to identity theft or denial of benefits, public humiliation, discrimination, and the potential for blackmail. Harm at the high impact level involves serious physical, social, or financial harm, resulting in potential loss of life, loss of livelihood, or inappropriate physical detention.
Incremental backup – A backup of only files which have changed since the last full backup.
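A minimal sketch of how an incremental backup selects files, assuming modification timestamps are the change indicator. `files_changed_since` is a hypothetical helper; real backup tools also track deletions, metadata, and renames:

```python
import os

def files_changed_since(root: str, last_full_backup: float) -> list[str]:
    """Return paths under root modified after the last full backup,
    given as a timestamp in seconds since the epoch."""
    changed = []
    for dirpath, _dirs, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            # Compare each file's modification time against the cutoff
            if os.path.getmtime(path) > last_full_backup:
                changed.append(path)
    return sorted(changed)
```

Only the returned files would be copied to the incremental backup set; restoring the system requires the last full backup plus the incrementals taken since.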
Information Asset – information, software, services, intangibles (e.g. reputation), and physical assets (devices or components) used to access, store, or process information assets.
Instructional Systems – Workstations dedicated to running instructional software or displaying presentations of educational material which (1) do not have any High Risk Data, Sensitive Data Collections, or Sensitive Data stored on them, and (2) are not configured to have access to such data on file shares.
Integrity – The NIST Federal Information Processing Standards (FIPS) Publication Series 200, “Minimum Security Requirements for Federal Information and Information Systems,” defines integrity as “(g)uarding against improper information modification or destruction, and includes ensuring information non-repudiation and authenticity.”
Internal Data – information that, if disclosed or modified without authorization, would have a moderate adverse effect on the operations, assets, or reputation of the University, or the University’s obligations concerning information privacy.
IT Security Incident – The NIST Federal Information Processing Standards (FIPS) Publication Series 200, “Minimum Security Requirements for Federal Information and Information Systems,” defines an information technology (IT) security incident as “(a)n occurrence that actually or potentially jeopardizes the confidentiality, integrity, or availability of an information system or the information the system processes, stores, or transmits, or that constitutes a violation or imminent threat of violation of security policies, security procedures, or acceptable use policies.”
An IT Security Incident at this University may affect:
- Confidentiality, through unauthorized disclosure of University Data;
- Integrity, through unauthorized change or deletion of University Data;
- Availability, through inhibition of access to or control of computer or network based resources used in University business processes.
An IT Security Incident may occur
- Intentionally, for example in the course of a malefactor’s violation of a statute or regulatory requirement, or
- Unintentionally, for example, in the course of a natural disaster.
IT Security Incident threats arise or have the potential to arise when vulnerabilities can be exploited.
Likelihood – the perceived likelihood of an adverse event occurring over a certain time range, typically one year.
Low strength encryption (“low strength”) – encryption cipher strength less than Medium strength encryption.
Medium strength encryption (“Medium strength”) – minimum 128-bit AES, IDEA, or RC4 encryption ciphers, or 168-bit (effectively 112-bit) 3DES encryption ciphers.
MIXED Computing Environment – A mixture of varying levels of awareness of Unit University Data distribution, and of the effectiveness of administrative control of the systems and data storage.
Non-Secure Systems – Workforce members’ personal computers not administered and secured by Unit staff under Policy SS.2 Establish Workstation and Server Access Controls, but utilized to carry out University business processes or to access University Data.
Offsite backup – Backup media physically stored elsewhere besides a Unit’s assigned space or the campus building in which that space is located.
Originator Usage Period – applicable to the use of a unique Symmetric Data Encryption key in applying the original cryptographic protection to information (i.e., encrypting storage under DCS.S.4.1.2, Data Encryption Storage Standards). During the originator-usage period, information may be encrypted by the data-encryption key; the key shall not be used for performing an encryption operation on information beyond this period. However, the key may need to be available to decrypt the protected data beyond the originator-usage period (i.e., the recipient-usage period may need to extend beyond the originator-usage period).
Personally Identifiable Information (PII) – NIST SP 800-122, Guide to Protecting the Confidentiality of Personally Identifiable Information (PII), defines PII as any information about an individual maintained by a Unit, including (1) any information that can be used to distinguish or trace an individual’s identity, such as name, social security number, date and place of birth, mother’s maiden name, or biometric records; and (2) any other information that is linked or linkable to an individual, such as medical, educational, financial, and employment information.
At a minimum, Personally Identifiable Information (PII) must be treated as Internal Data, and elements of PII may be classified as Sensitive, Confidential, or High Risk Data. This definition, however, does not supersede University policy on FERPA data.
Examples of PII Data include, but are not limited to:
- Name, such as full name, maiden name, mother’s maiden name, or alias
- Personal identification number, such as social security number (SSN), passport number, driver’s license number, taxpayer identification number, patient identification number, and financial account or credit card number
- Address information, such as street address or email address
- Asset information, such as Internet Protocol (IP) or Media Access Control (MAC) address or other host-specific persistent static identifier that consistently links to a particular person or small, well-defined group of people
- Telephone numbers, including mobile, business, and personal numbers
- Personal characteristics, including photographic image (especially of face or other distinguishing characteristic), x-rays, fingerprints, or other biometric image or template data (e.g., retina scan, voice signature, facial geometry)
- Information identifying personally owned property, such as vehicle registration number or title number and related information
- Information about an individual that is linked or linkable to one of the above (e.g., date of birth, place of birth, race, religion, weight, activities, geographical indicators, employment information, medical information, education information, financial information).
PHI (Protected health information) – individually identifiable health information. See EPHI.
PIPA – Illinois Public Act 094-0036, “Personal Information Protection Act”, Illinois Compiled Statutes Chapter 815, Act 530, as revised under Senate Bill 1833, passed by both Houses 5/31/15.
Portable data storage media (portable media) – magnetic, optical, or other data storage technology media intended to be removed from the devices that store or read information on the media, and thereby portable: typically CDs, DVDs, or tapes.
Portable data storage device – any physical computing device capable of data storage designed to be readily picked up and transported by persons from workplace to workplace, as opposed to “desktop computers” or other bulkier and less transportable systems such as servers generally meant to be utilized at a relatively permanent and single fixed physical location. Portable data storage devices include tablet, laptop, or notebook computers, and also handheld devices such as PDAs, USB interface storage devices including “thumb drives” and “flash drives”, and all types of portable data storage media.
Principals – see Campus Information Technology Principals
Private Data – Data that is to be observed only by the originator or sender(s) and a designated recipient or receiver(s).
Program – this Information Technology Security Program.
Program Components – this Information Technology Security Program’s Information Technology Security Policies, Procedures, Forms (Worksheets), Guidelines and Standards.
Public Access Systems – Systems identified by the Guideline DCS.G.2.4 Conduct Risk Assessment and evaluated under the DCS.G.2.5 Implement Risk Management Program process which are designed to allow unauthenticated access. These systems are specifically exempted from the Policy SS.5 Automatic Locking requirements for non-privileged login accounts which provide unauthenticated access services. As an example, a unit may define publicly accessible information kiosks (where the public may input information queries) or ACCC administered Pharos print stations as Public Access Systems if they are identified and evaluated as Public Access Systems in the above Guidelines.
Public Data – Public Data is University Data intended for public use that, when used as intended, would have no adverse effect on the operations, assets, or reputation of the University, or the University’s obligations concerning information privacy.
Recipient Usage Period – applicable to the use of a unique Symmetric Data Encryption key in decrypting information (i.e., encrypted storage under DCS.S.4.1.2, Data Encryption Storage Standards). During the Recipient Usage Period, information may be decrypted by the data-encryption key. The Recipient Usage Period should not exceed the Originator Usage Period plus three years. Thus, a unique Symmetric Data Encryption key under the DCS.S.4.1.2 Standard shall not be used for a period exceeding 5 years from the date the information has been encrypted.
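The usage-period arithmetic above can be sketched as follows. The two-year originator-usage period is implied by the Standard (five-year cap minus the three-year recipient extension) and matches NIST SP 800-57 guidance; `key_expiry` is a hypothetical helper, not part of DCS.S.4.1.2:

```python
from datetime import date

ORIGINATOR_YEARS = 2       # implied originator-usage period (5 - 3 years)
RECIPIENT_EXTRA_YEARS = 3  # recipient period may extend at most 3 years further

def key_expiry(encrypted_on: date) -> date:
    """Latest date a unique symmetric data-encryption key may still be
    used to decrypt data first protected on encrypted_on, applying the
    five-year total cap from the Standard."""
    total = ORIGINATOR_YEARS + RECIPIENT_EXTRA_YEARS
    return encrypted_on.replace(year=encrypted_on.year + total)

print(key_expiry(date(2015, 6, 1)))  # → 2020-06-01
```

After the originator-usage period the key may no longer encrypt new data, but it remains available for decryption until the expiry date computed above.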
Reporter – Any Unit staff member who identifies a potential or actual conflict in responsibility or implementation of any Program requirement.
RPO – Recovery Point Objective. The maximum tolerable period in which data might be lost from an information technology service due to a major incident.
RTO – Recovery Time Objective. The duration of time within which a business process must be restored after a disaster in order to avoid unacceptable consequences associated with a break in business continuity.
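The relationship between RPO and a backup schedule can be illustrated with a small check using hypothetical figures: the worst-case data loss after an incident is the time since the most recent backup, so the backup interval must not exceed the RPO:

```python
from datetime import timedelta

rpo = timedelta(hours=24)             # at most one day of data may be lost
backup_interval = timedelta(hours=6)  # hypothetical: backups every 6 hours

# Worst-case loss equals one full backup interval, so the schedule
# satisfies the RPO only when the interval does not exceed it.
meets_rpo = backup_interval <= rpo
print(meets_rpo)  # → True
```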
Sensitive Data – Information that, if disclosed or modified without authorization, would have a serious adverse effect on the operations, assets, or reputation of the University, or the University’s obligations concerning information privacy. Information covered by FERPA, Non-Disclosure Agreements (NDAs), and other intellectual property is, as a minimum, in this class.
Note: Data described in Non-Disclosure Agreements may fall into the High Risk Data or Sensitive Data categories, and should be individually evaluated.
Sensitive Data Collection – a collection of Sensitive Data that results from compiling (i.e., collecting) the Sensitive Data from multiple sources. For example, an instructor’s compilation of grades for their own classes, held on their own computer, would not be a Sensitive Data Collection. However, a department’s compilation of all the grades for all the classes in the department, held in storage, would be a Sensitive Data Collection.
Where a requirement is given in this Program for Sensitive Data, the same requirement applies to Sensitive Data Collections as a minimum threshold. Sensitive Data Collections are specifically identified in this Program where a more restrictive or extensive requirement applies to a Sensitive Data Collection than to Sensitive Data.
Server – any system, application, or data storage device which is configured to allow concurrent access by more than one User (excluding a single User per device and the system administrator, if separate from the User).
Strong encryption (“Strong” or “High strength”) – ciphers with an encryption level at least as strong as AES (192- and 256-bit key lengths), Blowfish (128- to 448-bit key lengths, in 8-bit increments), and ARCFOUR (2048-bit).
System – Any computer or networking resource such as a Workstation, Server, or router.
Threat – The NIST Federal Information Processing Standards (FIPS) Publication Series 200, “Minimum Security Requirements for Federal Information and Information Systems,” defines a threat as “(a)ny circumstance or event with the potential to adversely impact organizational operations (including mission, functions, image, or reputation), organizational assets, or individuals through an information system via unauthorized access, destruction, disclosure, modification of information, and/or denial of service. Also, the potential for a threat-source to successfully exploit a particular information system vulnerability.”
UISO – Unit Information Security Officer, typically at the college or upper administrative Unit level.
UISOO – Unit Information Systems Operations Officer. These are staff assigned system administration or operations duties, or managers of such staff (IT Directors for example).
Unit – an entity of administrative organization at the University of Illinois. The term is defined respectively by the University under its Statutes, and by its Office of Business and Financial Services:
- The University of Illinois Statutes, as amended January 24, 2013
(http://www.uillinois.edu/trustees/statutes.cfm), describe Units at:
Article III. Campuses, Colleges, and Similar Units
Section 1. The Campus
- The campus is the largest educational and administrative group. It is composed of colleges, schools, institutes, and other educational units in conjunction with administrative and service organizations.
Section 4. The School and Similar Campus Units
- In addition to colleges and departments, there may be other units of a campus, such as a school, institute, center, hospital, and laboratory, of an intermediate character designed to meet particular needs.
Article VIII. Changes in Academic Organization
Section 1. Definitions
- Unit. For the purposes of Article VIII, a unit is a division of the University to which academic appointments can be made and to which resources can be allocated, including departments or similar units, centers, institutes, schools, and colleges.
- The University of Illinois Office of Business and Financial Services (OBFS) defines a functional reporting Unit (such as a school, college, or department) with discrete financial activities as an organization, and utilizes a segment of the C-FOAPAL accounting string to identify Banner chart/org combinations that define university business units.
The OBFS Banner Alerts and Resources at http://www.obfs.uillinois.edu/banner-alerts/ give instructions on accessing the Banner FIFGRORGH (Organization Hierarchy) report of organizations (functional reporting Units with discrete financial activities) at the University.
University business process – see business process: a process or function carried out by a Unit and its Workforce in the course of executing the role assigned to it by the University, or one that supports or facilitates such activity.
University Data – A type of University Information Asset: data created or acquired by the University and its Workforce in the course of planning or carrying out University business processes (Business functions).
University Information Asset – all Information Assets with value to the University as defined by one or more of the following criteria:
1) by their usage in University business processes as identified in ,
2) by the reputational risk to the University if the availability, integrity, or confidentiality of the University Information Asset is compromised, or
3) by their consistency with the definition of the University’s data in the University’s Office of Business and Financial Services’ Business and Financial Policies and Procedures, Section 19.5, “Information Security Policy – The University of Illinois”
Vulnerability – The NIST Federal Information Processing Standards (FIPS) Publication Series 200, “Minimum Security Requirements for Federal Information and Information Systems”
defines a vulnerability as “(a) weakness in an information system, system security procedures, internal controls, or implementation that could be exploited or triggered by a threat source.”
Whole/Full Disk Encryption (FDE) – the process of encrypting all the data on a physical hard drive, or on an aggregation of physical hard drives combined through RAID technology.
Workforce – all University employees with appointments in a specific Unit, together with employees of external companies engaged under University contracts to carry out University business processes utilizing University Data for a Unit.
Workstation – any computer hardware implemented to provide User access to University Information Assets, including systems, applications, or data, regardless of the technology by which the data is stored on, transmitted to, or accessed by that Workstation.
Workstations include traditional terminals; notebook, laptop, portable, and desktop personal computers; tablet computers; personal digital assistants; and mobile/wireless/broadband telephones or other such network access devices.