Transparency Act

Done.ai's due diligence under the “Transparency Act” (“Åpenhetsloven”)

Done.ai takes human rights seriously and complies with the requirements of the Norwegian “Transparency Act” (“Åpenhetsloven” – the Act relating to enterprises' transparency and work on fundamental human rights and decent working conditions). The “Transparency Act” gives the public the right to request information on how Done.ai addresses actual and potential negative consequences for fundamental human rights and decent working conditions. The main purpose of the Act is to promote companies' respect for fundamental human rights and decent working conditions in connection with the production of goods and the provision of services. Done.ai's due diligence assessment is published in its entirety on this page.

Done.ai – responsibilities, guidelines and routines

  • Done.ai’s vision is to enable a more integrated and automated financial ecosystem through innovative technology and embedded banking services. The company delivers digital financial infrastructure that streamlines and automates processes across accounting, payments, and banking. At the time of writing, Done.ai operates primarily in Norway, and the overall responsibility lies with CEO Staffan Herbst.

  • The company has developed its own routines for reporting and dealing with actual and potential negative consequences for basic human rights and decent working conditions.
  • Done.ai requires its own companies, suppliers, and all business partners to follow its Standards of Business Conduct. Potential negative consequences for fundamental human rights lie primarily in the use of subcontractors. Done.ai’s information security management system is based on ISO 27001 and sets strict requirements for regular risk assessments of subcontractors; assessment of the subcontractors' work in relation to human rights forms part of these risk assessments. The company also conducts annual due diligence assessments in accordance with the OECD Guidelines for Multinational Enterprises.

Result of due diligence assessment

  • In its due diligence assessment, Done.ai has not identified any significant risk of, or revealed any actual, negative consequences for fundamental human rights as a result of the company's operations.

Measures to limit negative consequences

Through a conscious focus on this topic, as well as clear reporting channels for employees and external stakeholders, the company believes the risk of negative consequences for fundamental human rights is minimal. In addition, employees are continuously made aware of their responsibilities in this area. Done.ai considers the information about measures the company has implemented, or plans to implement, to be sufficient to limit the risk of negative consequences. Done.ai will continuously assess situations that arise and the extent to which these measures achieve the desired effect.

Do you have questions about Done.ai’s due diligence assessment? Please contact Done.ai in writing at support@done.ai.

Last reviewed
25.03.2025