During the fieldwork phase, the auditor(s) gather audit evidence by working methodically through the workplan or checklist: interviewing staff, managers and other stakeholders associated with the ISMS; reviewing ISMS documents, printouts and data (including records of ISMS activities such as security log reviews); observing ISMS processes in action; and checking system security configurations. Audit tests are performed to validate the evidence as it is gathered, and audit working papers are prepared documenting the tests performed.
The first part of the fieldwork typically involves a documentation review. The auditor reads and makes notes about documentation relating to and arising from the ISMS (such as the Statement of Applicability, Risk Treatment Plan, ISMS policy etc.). The documentation comprises audit evidence, with the audit notes being audit working papers.
Findings from the documentation review often indicate the need for specific audit tests to determine how closely the ISMS as currently implemented follows the documentation, to assess the general level of compliance, and to assess whether the documentation itself is appropriate in relation to ISO/IEC 27001. The results of the audit tests are normally recorded in checklists such as those provided in Appendix A and Appendix B.
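Audit test results of this kind are often captured in a simple structured record per checklist item. As a minimal sketch only (the field names, result values and `nonconformities` helper are illustrative assumptions, not a format defined by ISO/IEC 27001 or by the appendices mentioned above):

```python
from dataclasses import dataclass, field

@dataclass
class ChecklistItem:
    """One audit test recorded against a clause or control (hypothetical structure)."""
    clause: str                  # clause or control reference tested, e.g. "A.12.4"
    test: str                    # description of the audit test performed
    evidence: list = field(default_factory=list)  # references to working papers
    result: str = "not tested"   # e.g. "conforms", "nonconformity", "observation"

def nonconformities(checklist):
    """Filter the items whose recorded result is a nonconformity, for follow-up."""
    return [item for item in checklist if item.result == "nonconformity"]

checklist = [
    ChecklistItem("A.12.4", "Review security log records", ["WP-07"], "conforms"),
    ChecklistItem("A.9.2", "Check user access provisioning records", ["WP-12"], "nonconformity"),
]
for item in nonconformities(checklist):
    print(item.clause, "-", item.test)
```

A structure like this keeps each finding traceable back to the working papers that support it, which simplifies the later analysis and reporting phases.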
Technical compliance tests may be necessary to verify that IT systems are configured in accordance with the organization’s information security policies, standards and guidelines. Automated configuration checking and vulnerability assessment tools may speed up the rate at which technical compliance checks are performed but potentially introduce their own security issues that need to be taken into account*.
The output of this phase is an accumulation of audit working papers and evidence in the audit files.
* Note: automated system security audit tools are powerful utilities but are not appropriate in all environments. They can potentially undermine system security, perhaps introducing additional technical vulnerabilities, extracting highly sensitive information and affecting system performance or availability. Furthermore, auditors using such tools must be competent to operate them and to obtain meaningful data from them: a “pass” from an automated vulnerability assessment tool does not necessarily mean that a system is free of vulnerabilities and is hence secure. A wrongly configured or ineptly used database security review tool may bring down a production system. Such tools should only be introduced using the organization’s conventional change management processes, including pre-implementation security testing where appropriate.
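The core of an automated configuration compliance check is a comparison of observed settings against a documented baseline. A minimal read-only sketch, assuming a hypothetical baseline drawn from the organization's security standard (the setting names and expected values shown are illustrative, not prescribed by any standard):

```python
# Hypothetical baseline: expected settings taken from the organization's
# security standard. Real baselines would be maintained as controlled documents.
BASELINE = {
    "PasswordAuthentication": "no",
    "PermitRootLogin": "no",
    "Protocol": "2",
}

def check_compliance(observed, baseline):
    """Return one deviation message per setting that is missing or differs
    from the baseline. Purely a comparison: it changes nothing on the system,
    which keeps the check itself low-risk."""
    deviations = []
    for key, expected in baseline.items():
        actual = observed.get(key)
        if actual != expected:
            deviations.append(f"{key}: expected {expected!r}, found {actual!r}")
    return deviations

# Example: settings as read from a target system (illustrative values).
observed = {"PasswordAuthentication": "yes", "Protocol": "2"}
for deviation in check_compliance(observed, BASELINE):
    print(deviation)
```

Note that such a script only reports deviations from the documented baseline; as the note above warns, a clean result does not by itself demonstrate that the system is secure.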
The accumulated audit evidence is collated and filed, then reviewed and examined in relation to the risks and control objectives. Sometimes this analysis identifies gaps in the evidence or indicates the need for additional audit tests, in which case further fieldwork may be performed unless scheduled time and resources have been exhausted. However, prioritizing audit activities by risk implies that the most important areas should have been covered already.