Risk Analysis
- Conduct risk assessments for AI use cases, including risks related to data, model output, infrastructure, usage context, and user impact.
- Identify ethical, legal, and reputational risks associated with AI systems and recommend mitigation strategies.
Reporting & Communication
- Prepare regular reports on AI governance activities, compliance status, and risk findings.
- Facilitate effective communication between the AI governance team and other departments.
Administrative Support
- Schedule and coordinate AI governance meetings and cross-functional discussions.
- Prepare meeting minutes and consolidate materials for governance reviews and working groups.
- Maintain accurate records and documentation related to AI governance activities.
Governance & Compliance
- Assist in the development, review, and maintenance of AI governance policies, procedures, and standards.
- Monitor adherence to governance policies and escalate discrepancies or compliance issues.
- Support staff training on AI governance policies and procedures.
Regulatory Gap Analysis
- Analyze gaps between government regulations and guidelines (e.g., the PCPD's Artificial Intelligence: Model Personal Data Protection Framework and the Personal Data (Privacy) Ordinance (PDPO)) and internal corporate policies.
- Provide recommendations to align internal governance with evolving external regulatory requirements.