# CMM (capability maturity model) levels help to improve software life cycle processes.
Level 5: optimizing (continuous improvement).
Level 4: managed (quantitative quality).
Level 3: defined (documented process).
Level 2: repeatable (disciplined management process).
Level 1: initial (ad hoc, individual effort).
# CMMI (capability maturity model integration)
The purpose of CMMI is to integrate various software maturity models including CMM into a single model. Just like CMM, CMMI also has five maturity levels, but the description of each level is not similar to the CMM. CMMI levels:
Level 5 - optimizing (focuses on continuous process improvement)
Level 4 - quantitatively managed (process is measured and controlled)
Level 3 - defined (process is characterized for the organization and is proactive)
Level 2 - managed (process is characterized for projects and is often reactive)
Level 1 - initial (poorly controlled process, which can be unpredictable and reactive)
Business case - Provides the information needed to decide whether to start a project. It is developed from the results of the feasibility study, which is carried out during the project-planning phase.
Software size estimation methods:
1) SLOC (source lines of code)
2) Function point analysis - It considers the following parameters:
- Number of user inputs
- Number of user outputs
- Number of user inquiries
- Number of files
- Number of external interfaces
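The function point count can be sketched as a weighted sum of these five parameters. The weights below are the commonly cited average-complexity values from function point analysis; treat them and the sample counts as illustrative.

```python
# Unadjusted function point (UFP) count as a weighted sum of the five
# parameters. The weights are the commonly cited average-complexity
# values (inputs 4, outputs 5, inquiries 4, files 10, interfaces 7).
AVG_WEIGHTS = {
    "inputs": 4,
    "outputs": 5,
    "inquiries": 4,
    "files": 10,
    "interfaces": 7,
}

def unadjusted_function_points(counts: dict) -> int:
    """Sum each parameter count multiplied by its weight."""
    return sum(counts[name] * w for name, w in AVG_WEIGHTS.items())

# Example: a small application with hypothetical counts
counts = {"inputs": 10, "outputs": 7, "inquiries": 5, "files": 3, "interfaces": 2}
print(unadjusted_function_points(counts))  # 10*4 + 7*5 + 5*4 + 3*10 + 2*7 = 139
```

In full function point analysis the UFP is then adjusted by a value adjustment factor derived from general system characteristics; the sketch above stops at the unadjusted count.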
# Time box management
- This project management technique is used to deliver a software project within a short, fixed time frame using fixed resources.
- It can be used with rapid application development type projects.
- Advantage: prevents project cost overruns and delays.
- Project controlling activities: managing project scope, resources and risks.
# Project risks
1) Risks that impact the business benefit: project sponsors are responsible for mitigating these risks.
2) Project risks: the project manager is responsible for these.
Project risk management process consists of five steps:
1) Inventory risks
2) Assess risks
3) Mitigate risks
4) Discover new risks
5) Review and evaluate
- Errors caused by unauthorized access are the main problem with online programming methods.
Categories of program debug tools
1. Logic path monitors: identify errors in program logic.
2. Memory dumps: identify inconsistencies in data or parameters.
3. Output analyzers: check the accuracy of the results after execution.
- The certification and accreditation process starts after successful completion of the final acceptance test.
# Certification process - Assesses standard controls (operational, management, technical) in an information system. It examines the level of compliance with policies, standards, guidelines, processes and procedures. The goal of the certification process is to determine whether the controls operate correctly, produce the expected outcome and meet the security requirements. The outcome of the certification process helps to reassess and update the system security plan.
# Accreditation - Senior management’s decision to authorize IS operation and accept the risks (risks to IT assets, operations and individuals).
It is considered a form of quality control, which challenges IS managers and staff to implement highly effective security controls in the organization’s IT systems.
# Changeover (cutover or go-live technique)
This is an approach for migrating the users of an old system to a newly developed system. It is also known as cutover because it cuts users off from the old system and moves them to the new one.
- Parallel changeover: The old system is kept running while the new system is started, so both systems run at the same time. Users work with both systems, which helps to identify any problems they face while using the new system. When users gain confidence in the new system, the full changeover takes place.
- Phased changeover: This approach breaks the system down into several deliverable modules. The first module of the old system is replaced with the first module of the new system; then each remaining new module replaces its old counterpart until the changeover to the new system is complete.
Risks: IT resource challenges, an extended project life cycle, and ongoing change management for the old system.
- Abrupt changeover: On a specific date and time, the old system is switched over to the new system and use of the old system is discontinued.
Risks: asset safeguarding, data integrity, system effectiveness and efficiency.
- The main objective of a post implementation review is to assess and measure how much value the project has delivered to the business.
# EDI (electronic data interchange)
- Usually, EDI is used to transmit invoices, shipping orders and purchase orders.
- An EDI system requires the following components:
1. Communication software
2. Transaction software
3. Access to standards
When reviewing an EDI system, an auditor should consider whether it uses:
1. A proprietary version of EDI (most large organizations have their own proprietary EDI).
2. A publicly available commercial EDI (less costly, but with more security risks).
# Traditional EDI
1. Communication handler: A process that handles data transmission over dial-up lines or other public networks.
2. EDI interface: Manages and controls the data path between the communication handler and the application. Its two components are the EDI translator (converts data between EDI format and the application's proprietary format) and the application interface (used for data movement and data mapping).
3. Application system: A program that processes data before sending it to, and after receiving it from, trading partners. Web-based EDI is used for generic network access.
# EDI risks
1. Transaction authorization (the main risk in an EDI system)
2. Loss of business continuity
3. Deletion or manipulation of transactions
4. Duplicate EDI transmission and data loss.
5. Loss of transaction confidentiality
The IS auditor can verify the control objectives of EDI by reviewing the following:
1. Encryption in place
2. The existence of checks for data editing
3. Validity and reasonability check for each transaction
4. Logging of each inbound transaction.
5. Verifying the number and value of transaction with control totals
6. Using segment count totals
7. Using transaction set count totals
8. Using batch control totals
9. Validating the sender against the list of trading partners
Some other EDI audit options are:
1. Audit monitor: installed on the EDI computer to capture transactions so that an auditor can review them.
2. Expert systems: an audit monitor that can determine the significance of a transaction based on audit rules and prepare a report for the auditors.
# DSS (decision support system)
- DSS focuses more on effectiveness and less on efficiency.
- Prototyping is the preferred DSS development and design approach.
- The true evaluation of a DSS is whether it can improve management’s decision-making process.
# Data oriented system development:
A software development method in which data and data structures are used to represent software requirements. Elimination of data transformation errors is the major advantage of this method.
# Object oriented system development:
A programming technique, not a software development methodology, in which data and procedures are combined into a single entity (an object). Its advantages include handling unrestricted types of data, modeling complex relationships, and adapting to a changing environment.
# Component based development:
An extension of object-oriented system development. In this technique, applications are assembled from components with defined interfaces. The interfaces let the programs communicate with each other regardless of source language or operating platform.
Advantages: shorter development time; programmers can focus on the business functionality of the application; promotes modularity; allows combining code across languages and reusing code; lower development cost; and the buyer can purchase only the components the system needs, rather than a complete solution with unneeded features.
# Application controls
1. Data input
2. Data processing
3. Output function
# Input controls
- Input control is assured by the following:
A. Input authorization
B. Batch control and balancing
A. Input authorization:
- It ensures that all data input is authorized and approved by the responsible department or management. Input authorization types:
1. Signature on batch forms or source documents
2. Online access controls
3. Unique passwords
4. Terminal or client work station identification
5. Source document
B. Batch control and balancing
- Batch balancing ensures that each transaction file or document added to the batch is processed and accepted by the system.
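Batch balancing can be illustrated by recomputing control figures over the transactions actually processed and comparing them with the totals recorded on the batch header. The field names below are illustrative, not from any specific system.

```python
# Batch balancing sketch: recompute the record count, the monetary total
# and a hash total (a meaningless sum of a non-monetary field, here the
# account numbers) over the processed transactions, and compare them
# with the control figures on the batch header.
def balance_batch(header: dict, transactions: list) -> bool:
    record_count = len(transactions)
    amount_total = sum(t["amount"] for t in transactions)
    hash_total = sum(t["account"] for t in transactions)
    return (record_count == header["record_count"]
            and amount_total == header["amount_total"]
            and hash_total == header["hash_total"])

header = {"record_count": 3, "amount_total": 600, "hash_total": 3003}
txns = [{"account": 1001, "amount": 100},
        {"account": 1001, "amount": 200},
        {"account": 1001, "amount": 300}]
print(balance_batch(header, txns))        # True: batch is in balance
print(balance_batch(header, txns[:2]))    # False: a dropped record breaks the balance
```

A dropped, duplicated or altered transaction disturbs at least one of the three control figures, which is why all three are typically checked together.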
Input control techniques
1. Transaction log
2. Reconciliation of data - checks whether all data received are properly recorded and processed.
3. Documentation
4. Error correction procedures
5. Anticipation
6. Transmittal log
7. Cancellation of source document
# Data processing controls and procedures
The processing controls consist of:
A. Data validation and editing procedures
B. Processing controls
C. Data file control procedures
A. Data validation and editing procedures:
- Input data need to be validated and edited as soon as they are generated. Data validation means finding out data errors, incomplete/missing data and inconsistency in data.
- Edit controls are used before data are processed in order to prevent inaccurate data processing.
-Sequence check
-Limit check
-Range check
-Validity check
-Reasonableness check
-Check digit - A numeric value derived from the original data is appended to it to verify that the data has not been altered. It is used to detect transcription and transposition errors.
-Completeness check
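The check digit control can be illustrated with the Luhn algorithm (the scheme used on payment card numbers). It is one concrete instance of a check digit, chosen here only as an example: it catches all single-digit transcription errors and most adjacent transpositions.

```python
# Luhn check digit: working from the rightmost payload digit, double
# every second digit, subtract 9 from any doubled result above 9, sum
# everything, and pick the digit that makes the sum a multiple of 10.
def luhn_check_digit(payload: str) -> int:
    total = 0
    for i, ch in enumerate(reversed(payload)):
        d = int(ch)
        if i % 2 == 0:      # rightmost payload digit is doubled
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return (10 - total % 10) % 10

print(luhn_check_digit("7992739871"))  # 3, so the full number is 79927398713
```

Verification on receipt simply recomputes the check digit and compares it with the transmitted one; a mistyped or transposed digit almost always produces a mismatch.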
B. Data processing controls:
- It ensures that data are complete and accurate. Data processing control techniques include manual recalculation, editing, run-to-run totals, programmed controls, limit checks on amounts, reasonableness verification of calculated amounts, reconciliation of file totals, and exception reports.
C. Data file control procedures:
- It ensures that only authorized processing is performed on the data. The contents of data tables or files can be divided into the following categories:
1. System control parameters: any changes in these data can change the way system functions.
2. Standing data: they are not frequently changed. Example: suppliers' names, addresses etc.
3. Master data/balance data: current and total balances, which are frequently updated by new transactions. Audit trails must be maintained to detect unauthorized changes to these data.
4. Transaction logs: these logs are controlled by validation checks, exception report, control totals etc.
Important controls for data files:
Before and after image reporting - The data file before and after the processing need to be recorded to analyze the processing impact on the database.
Error reporting and handling - Those who input the data should not review and authorize the error correction.
Source file retention
Version usage - It is necessary to process the correct version of a file, because an older version may not run all the required procedures.
Data file security - Used to prevent unauthorized access to the data.
One-for-one checking - Individual documents are matched against the processed output to make sure every document has been processed.
Transaction logs - The activities to be recorded include input time, username, input terminal/computer details, etc. These activity data help to generate audit trails and can be used to find errors or warnings and to restore the system after a technical problem.
Parity check - Used to detect transmission errors in data. A parity check applied to a single character is called a vertical or column check; a parity check applied across all the data is called a horizontal (longitudinal) or row check. Using both types of parity check simultaneously greatly increases the chance of detecting errors that a single type would miss.
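The two parity directions can be sketched over a small block of bytes: a vertical (per-character) parity bit for each byte, plus a longitudinal (row) parity byte XORed across the whole block. This is a minimal sketch of the idea, not any particular transmission protocol.

```python
from functools import reduce

def vertical_parity(byte: int) -> int:
    """Even-parity bit for one character (vertical / column check)."""
    return bin(byte).count("1") % 2

def longitudinal_parity(block: bytes) -> int:
    """XOR of all bytes: the row-check parity byte for the whole block."""
    return reduce(lambda a, b: a ^ b, block, 0)

block = bytearray(b"EDI")
col_bits = [vertical_parity(b) for b in block]   # one bit per character
row_byte = longitudinal_parity(block)            # one byte per block

# Flip a single bit "in transit": both checks now disagree with the
# values computed before transmission, so the error is detected.
block[1] ^= 0b0000_0100
print([vertical_parity(b) for b in block] != col_bits)   # True
print(longitudinal_parity(block) != row_byte)            # True
```

Certain multi-bit errors cancel out in one direction but not in the other, which is why combining both checks detects more errors than either alone.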
# Output controls
- Output controls ensure consistent and secure delivery of data, presented to users in the proper format. The output controls are:
1. Logging and storage of forms in secure place
2. Computer-generated forms and signatures: all computer-generated forms should be compared with the physical copies, and someone should be accountable for any issues, exceptions or unwanted modifications of the forms.
3. Distribution of report: the report should be distributed to the person authorized to receive it.
4. Balancing & reconciling: procedures to find errors in the output report should be established, and the report should be delivered to the concerned department for review and correction.
5. Output report retention: there should be a report retention schedule, and the retention policy should follow any applicable legal regulations.
6. Report receipt: the recipient of the reports should sign the record or logbook. This makes sure that sensitive reports are distributed properly.
# Tasks of IS Auditor in application controls
1. Identifying the important applications and their components, understanding the flow of information among the applications or systems, and gaining knowledge about the applications by reading available documentation and interviewing IS personnel.
2. Identifying the strengths of the application controls and evaluating the impact of any identified weaknesses.
3. Understanding the functionality of the applications by reviewing the system documentation.
# Data integrity tests
Data integrity tests consist of a number of substantive tests. They aim to test data accuracy, consistency and authorization.
1. Relational integrity: performed at the data element or record level. Relational integrity can be maintained by building data validation routines into the applications; it can also be implemented in the database by defining input constraints and data characteristics in the tables.
2. Referential integrity: tests the relationships between entities in the tables of a database. Referential integrity maintains interrelationship integrity in the relational database model (RDBMS): a relational database establishes relations among tables using references between primary and foreign keys, and referential integrity tests make sure that every such reference points to an existing row in the referenced table.
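Referential integrity enforcement can be demonstrated with SQLite (via Python's standard library): a row whose foreign key references a non-existent primary key is rejected. The table and column names are illustrative.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("""CREATE TABLE orders (
    id INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customer(id))""")

conn.execute("INSERT INTO customer VALUES (1, 'Acme')")
conn.execute("INSERT INTO orders VALUES (10, 1)")   # valid reference: accepted

try:
    conn.execute("INSERT INTO orders VALUES (11, 99)")  # no customer with id 99
except sqlite3.IntegrityError as e:
    print("rejected:", e)   # the orphaned reference is refused by the database
```

When the database engine does not enforce foreign keys, the same check must be built into the application's validation routines, which is harder to audit.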
# Data integrity in online transaction
- The integrity of online data is maintained by four principles (ACID).
Atomicity: a transaction either completes in full or not at all. If a transaction cannot complete because of a problem, the database must return to its state before the transaction, which ensures atomicity.
Consistency: after each transaction, the database should go from its previous consistent state to another consistent state.
Isolation: every transaction should be isolated and it should have access to the database in a consistent state.
Durability: when a transaction is considered complete, then the database should retain the data even after any hardware or software failure.
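Atomicity can be sketched with a database transaction: when one statement in the unit of work fails, rolling back restores the pre-transaction state. The sketch uses SQLite from Python's standard library with an illustrative account table.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE account "
             "(name TEXT PRIMARY KEY, balance INTEGER CHECK (balance >= 0))")
conn.execute("INSERT INTO account VALUES ('A', 100), ('B', 0)")
conn.commit()

# Transfer 150 from A to B: the debit would make A negative and violates
# the CHECK constraint, so the whole transaction is rolled back and the
# earlier credit to B is undone (atomicity).
try:
    conn.execute("UPDATE account SET balance = balance + 150 WHERE name = 'B'")
    conn.execute("UPDATE account SET balance = balance - 150 WHERE name = 'A'")
    conn.commit()
except sqlite3.IntegrityError:
    conn.rollback()

print(list(conn.execute("SELECT name, balance FROM account ORDER BY name")))
# [('A', 100), ('B', 0)] -- both balances are back to their pre-transaction state
```

Durability is then the engine's promise that, once `commit()` returns, the new state survives a crash; consistency and isolation govern what concurrent transactions may observe in between.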
> The main advantage of component-based development is the compatibility of the developed components with multiple platforms and environments.
> Inadequate software baseline can result in project scope creep.
> An IS auditor reviewing agile software development can expect post-iteration reviews that document lessons learned.
> Checksum in data is used for integrity testing.
> The transaction journal records transaction activity; therefore, comparing the transaction journal with the authorized source data will reveal any unauthorized input from a terminal (a specific computer).
> A console log printout does not record transaction activity from a terminal.
> An automated suspense file only shows the transactions that need action.
> No modification is allowed once data are in the warehouse.
> A warehouse is just a copy of the original transaction data and it is used for query and analysis.
> Metadata works as a table of contents for the warehouse; that is why metadata is considered the most important design element of a data warehouse.
> RAD is a management technique.
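The checksum note above can be illustrated with CRC-32 from Python's standard library: the sender stores or transmits the checksum, and the receiver recomputes it to detect alteration. The record contents are illustrative.

```python
import zlib

record = b"invoice-4711;amount=100.00"
stored_crc = zlib.crc32(record)   # checksum computed at the source

# Unaltered data reproduces the stored checksum...
print(zlib.crc32(record) == stored_crc)      # True
# ...while any modification changes it, revealing the tampering or error.
tampered = b"invoice-4711;amount=900.00"
print(zlib.crc32(tampered) == stored_crc)    # False
```

Note that CRC-32 detects accidental corruption but is not cryptographically secure; an adversary who can recompute the checksum needs a keyed mechanism such as an HMAC instead.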