Most, if not all, data collectors want to maximise the quality of their data and minimise the reporting burden on their filers. Many decisions about how to run a filing programme lead to trade-offs between these measures. An example of this is filing rules, where the quantity and complexity of rules must be tuned to deliver the required data quality with an acceptable burden on the filers.
Fortunately, there are some approaches that do not need trade-offs. In this article, we consider practical steps that data collectors can take to both increase data quality and reduce costs at the same time.
Create an environment that allows improvements
Reports of a steady reduction in filing costs and increase in data quality at the SEC show how filer-driven market forces can improve value and reduce cost. How fast the market drives down cost and moves toward quality can be significantly affected by the environment created by the data collector, and in particular by how filing programme changes are carried out. The most significant of these changes is the release of reporting requirements or taxonomies.
Delays in expected schedules for reporting taxonomies, or the presence of errors and related hotfixes, can create an environment where the market is focussed on the data collector’s process and must invest in solutions that mitigate these risks. On the other hand, if taxonomies are consistently released as planned and filers can rely on them to be correct, then the focus shifts to the filers’ own processes and improvements.
In order to achieve this, CoreFiling use a process known as Rule and Interface Management (RIM), backed by the Taxonomy Management System (TMS), for all changes. The RIM process concentrates on the single most important aspect of the change: describing the business data. All other work, such as building, testing, documenting and packaging taxonomies and changes, is automated within the TMS. This simple focus allows change to be delivered predictably, and the TMS ensures that the output is 100% error-free.
Concentrate on being “standards compliant”
One of the promises of standardisation in business reporting is that standards-based software can be reused for different purposes, even across filing programmes. This means that enhancements, such as those designed to improve data quality, are automatically available to all without additional cost. This is a clear case of ensuring better quality with a lower burden.
Many data collectors lose this key benefit by relying on their own interpretations of the XBRL specification and layering bespoke additions on top to solve their own challenges, even where standards-based options are available. These choices make the filing programme less compliant with the standards, and once the market has built software specifically for an individual filing programme, costs invariably rise and quality features take longer to appear.
For most data collectors, there will inevitably be requirements that are specific to their filing programme. To address these in the most standards-compliant manner, the solutions must be designed with deep understanding of both the specification and its common and accepted use. CoreFiling work with a broad range of global taxonomies and actively help to define XBRL standards and best practice. This ensures that we have the skills and knowledge required to deliver taxonomies that meet business needs with no surprises. Our taxonomy software also allows for selection of predefined architectures for those looking to align to a specific style of reporting and maximise this benefit.
Help the reporting function
External reporting remains, for the time being at least, a largely human occupation. Finding ways to help the people in the filers’ reporting functions get it right is a very powerful way of getting high value data and saving costs across the board. Some approaches that have proven successful in achieving this are given below:
- Pre-filled forms. By pre-filling forms and templates, the reporting function’s task becomes one of “confirm” rather than “provide”. The data also stays consistent with what has already been collected.
- Derived formulae. Turning data quality rules around and using them to calculate values in a report automatically means those rules are always met, and the effort of filing can be drastically reduced.
- Flexible formats. Spreadsheets are a great way to collect data that is typically found in a filing cabinet. On the other hand, reporting the key details of millions of assets is not a task well suited to spreadsheets. Allowing filers flexibility in reporting formats avoids data conversions and rekeying and makes it as easy as possible to report the data. Converting alternative formats back to XBRL prior to validation ensures that data quality is consistent however the data arrives.
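The “derived formula” idea above can be sketched in a few lines: instead of asking filers to report a total and then validating it against a rule, the collector computes the total from its components, so the rule holds by construction. This is a minimal Python sketch, with illustrative concept names (TotalAssets, CurrentAssets, NonCurrentAssets) that are not drawn from any specific taxonomy:

```python
# Hypothetical sketch of a derived formula: the data quality rule
# "TotalAssets = CurrentAssets + NonCurrentAssets" is reversed, so the
# collector calculates TotalAssets rather than asking the filer to
# report it and checking it afterwards. Concept names are illustrative.

def derive_total_assets(report: dict) -> dict:
    """Return a copy of the report with TotalAssets filled in."""
    derived = dict(report)
    derived["TotalAssets"] = report["CurrentAssets"] + report["NonCurrentAssets"]
    return derived

filed = {"CurrentAssets": 1200, "NonCurrentAssets": 800}
complete = derive_total_assets(filed)

# The rule is satisfied by construction, so it can never fail validation.
assert complete["TotalAssets"] == complete["CurrentAssets"] + complete["NonCurrentAssets"]
```

The same pattern applies to any rule that uniquely determines one value from others: the filer reports fewer facts, and the derived facts are consistent by definition.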
To sum up
In order to maximise data quality and minimise the filing burden, data collectors can do more than balance the quantity and strictness of rules. By creating an environment where the filing ecosystem can plan and make safe assumptions, market forces will do much of the hard work. Standards compliance ensures that software is cheaper and has more features. Finally, adding features to a programme that directly help the people filing is very effective.
If you are interested in seeing how CoreFiling achieve the above and more, then we are currently running hands-on taxonomy workshops using our leading Taxonomy Management System guided by expert consultants. For more information and to book a workshop, visit https://www.corefiling.com/taxonomy-workshop/