The implementation of Long Duration Targeted Improvement (LDTI) introduces a number of changes from current Generally Accepted Accounting Principles (GAAP) and spans cross-functional business units. These changes apply to the measurement of specific traditional, non-participating life products as well as their groupings and assumptions. In addition, changes are introduced to the chart of accounts, financial statements, disclosures, and reporting and analytics of LDTI contracts.

This article looks at how the changes in accounting policies affect actuarial measurements and what they look like before and after the implementation.

The Complexity of LDTI for Insurers

Actuarial Measurements – the Impact and Effects

The far-reaching LDTI implementation will affect actuarial measurements in many ways. To understand how it will impact your organization, we break down the areas involved and how each changes from current GAAP to LDTI:

Grouping for Reserve Calculation: Current GAAP groups at the policy level, whereas LDTI groups policies into cohorts—quarterly or annual—with each cohort based on similar characteristics.
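For illustration, here is a minimal sketch of annual cohort grouping; the policy fields and the issue-year-plus-product grouping key are assumptions for the example, not a prescribed implementation.

```python
from collections import defaultdict

# Illustrative policy records; field names are assumptions for this sketch.
policies = [
    {"policy_id": "P001", "product": "term_life", "issue_date": "2023-03-15"},
    {"policy_id": "P002", "product": "term_life", "issue_date": "2023-11-02"},
    {"policy_id": "P003", "product": "whole_life", "issue_date": "2024-01-20"},
]

# Under LDTI, policies are grouped into cohorts (here, annual cohorts keyed by
# issue year and product line) rather than measured policy by policy.
cohorts = defaultdict(list)
for policy in policies:
    issue_year = policy["issue_date"][:4]
    cohorts[(issue_year, policy["product"])].append(policy["policy_id"])

for cohort_key, members in sorted(cohorts.items()):
    print(cohort_key, members)
```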

Current Best-Estimate Assumptions for the Liability for Future Policy Benefits: Current GAAP locks assumptions in at issue with a Provision for Adverse Deviation (PAD), updating them only when a loss recognition event occurs.

However, under LDTI, assumptions are reviewed and updated if needed at least once per year, at the same time each year, with no PAD.

Discount Rate Methodology: Under current GAAP, the discount rate equals the insurer’s expected investment yield. With LDTI, all insurance companies will use the same discount rate assumption: the yield on an upper-medium grade (low credit risk) fixed-income instrument, generally interpreted as a single-A rated corporate bond yield.

Retrospective Unlocking Approach for Non-discount Rate Assumptions: Under the current GAAP, the Net Premium Ratio (NPR) is locked in based on the expected future cash flows at the issue level.

On the other hand, LDTI revises the NPR using actual historical experience together with updated assumptions and discount rates at the issue level. The difference between the previous and revised NPR is used to determine a revised liability, reflected in the current operating period income.
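To make the mechanics concrete, here is a simplified sketch with invented cash flows and a flat locked-in discount rate; the 100% cap on the NPR and the net premium reserve formula follow the standard, but a real valuation rests on full actuarial projections.

```python
def present_value(cash_flows, rate):
    """Discount a list of end-of-period cash flows at a flat annual rate."""
    return sum(cf / (1 + rate) ** (t + 1) for t, cf in enumerate(cash_flows))

# Invented figures: actual historical plus updated expected cash flows.
locked_in_rate = 0.04
gross_premiums = [100, 100, 100, 100, 100]
benefits = [60, 65, 70, 80, 90]

# Revised NPR: PV of benefits over PV of gross premiums, capped at 100%.
revised_npr = min(
    present_value(benefits, locked_in_rate)
    / present_value(gross_premiums, locked_in_rate),
    1.0,
)

# Revised liability over the remaining future periods (here, the last three):
# PV of future benefits less the revised NPR share of future gross premiums.
future_benefits, future_premiums = benefits[2:], gross_premiums[2:]
revised_liability = (
    present_value(future_benefits, locked_in_rate)
    - revised_npr * present_value(future_premiums, locked_in_rate)
)
print(f"Revised NPR: {revised_npr:.1%}, revised liability: {revised_liability:.1f}")
```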

Simplification of the DAC Model: Under current GAAP, DAC is amortized in proportion to expected future profitability or premiums recognized, with an accrual of interest. With LDTI, DAC is amortized in proportion to the expected life of the contract, with no accrual of interest.
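Toy numbers make the difference visible; the figures below are invented, and the interest accrual on the current-GAAP balance is omitted for brevity.

```python
# Hypothetical DAC balance and profit pattern for a single cohort.
dac_balance = 1_000
expected_life_years = 10
expected_profits = [200, 180, 160, 140, 120, 100, 80, 60, 40, 20]

# Current GAAP: amortize in proportion to expected profits (interest accrual
# on the unamortized balance is omitted here for simplicity).
total_profit = sum(expected_profits)
gaap_schedule = [dac_balance * p / total_profit for p in expected_profits]

# LDTI: amortize straight-line over the expected life, with no interest accrual.
ldti_schedule = [dac_balance / expected_life_years] * expected_life_years

print(f"Year 1 amortization – current GAAP: {gaap_schedule[0]:.0f}, "
      f"LDTI: {ldti_schedule[0]:.0f}")
```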

Market Risk Benefits (MRBs) Subject to Fair Value Measurement: Under current GAAP, some guaranteed benefits are valued under an insurance accrual model rather than at fair value. With LDTI, MRBs are measured at fair value, with changes recognized in the income statement.

Increased Financial Statement Disclosure Requirements: Under current GAAP, the level of disclosure aligns with historical representations. With LDTI, there is a significant increase in disclosure requirements, including disaggregated roll-forwards.

Investor and Insurance Companies – the Impact and Effects

The changes and volatility introduced by the move to LDTI will affect both investors and insurance companies.

For instance, investors will now need to recalibrate expectations for the financial position, profitability, and income volatility of insurance companies following the change in the accounting standards from GAAP.

Insurance companies, on the other hand, should assess the strategic implications of LDTI for their business in order to develop a plan to address the required changes in processes under a narrowing timeline.

This change creates another opportunity for insurers to invest in modernizing finance processes to better understand the key drivers of change affecting financial statements. They may also want to divest underperforming business that is not central to their strategy, or consolidate a leading position in lines of business that are, to strengthen their market position.

Optimus SBR’s Financial Services Practice

Optimus SBR is an independently owned management consulting firm that works with organizations across North America to get done what isn’t. Our Financial Services Group provides strategic advisory services, process improvement services, risk management services, and project management support to leading Financial Institutions, insurers, asset managers, and pension funds.

Contact us for more information on LDTI

Peter Snelling, Senior Vice President, Business Development
Peter.Snelling@optimussbr.com
416.649.9128

This piece was developed in partnership with BDO and Valani Global.

Optimus SBR, BDO, and Valani have come together to establish accelerators for the LDTI journey. Our accelerators not only meet the compliance needs of LDTI, but also move an insurer forward in the areas of financial transformation, operations modernization, and data innovation.

 

Service Partners


With access to a global knowledge base and professional expertise, BDO offers extensive value to their clients across all segments of the insurance and financial services industry.

 

Valani Global supports life insurance companies in achieving their financial risk management goals through implementations of Moody’s Analytics solutions including AXIS and RiskIntegrity for IFRS 17.

 

Toronto, June 9, 2022 – Optimus SBR, one of the largest independent management consulting firms in North America, announced its eighth acquisition in a decade with the purchase of n-gen People Performance Inc., a training organization that specializes in solutions for managing generational differences in the workplace.

“I’m excited to announce the purchase of n-gen as a great addition to our Learning & Enablement practice,” said Kevin Gauci, Founder and CEO, Optimus SBR. “We can now offer our clients an enriched suite of training services with n-gen’s uniquely focused workshops that help companies increase employee engagement, drive team performance, and retain and grow a multigenerational workforce.”

n-gen has trained more than 65,000 people in the areas of Leadership, Team Building, Early-in-Career Talent (Gen Z), Sales & Customer Service, and Human Resources. They were the first to define the generational disconnect in the workplace and provide targeted solutions. In addition to their hallmark workshops addressing the challenges and opportunities in multigenerational workplaces, n-gen has conducted research and published work on key leadership behaviours. They deliver training programs addressing issues that are highly relevant in today’s competitive environment, including leading through change; hybrid team collaboration; recruiting, retaining, and engaging top talent; building resilience; and selling and servicing diverse client groups.

The dynamic n-gen team is eager to offer their roster of training services to Optimus SBR’s clients.

“We’re very excited to be a part of a company like Optimus SBR that shares our entrepreneurial spirit and bold attitude – we make a great team! We welcome the opportunity to deliver n-gen’s training programs to a much broader audience and continue to help organizations improve their performance and productivity,” said Giselle Kovary, President and co-founder of n-gen People Performance Inc.

Over the last 10 years, Optimus SBR has been on a path of continuous growth, embracing change and making bold decisions to provide value-driven, innovative solutions for clients. The purchase of n-gen marks the next chapter in its remarkable story.

For life insurers, the introduction of the Long Duration Targeted Improvement (LDTI) standard is often a catalyst for adopting modern data management practices. However, compliance can’t be achieved without substantially aligning supporting IT systems with new accounting policies, data accessibility, and data analytics/reporting requirements.

For many insurers, this is the time to question the investment in upgrading legacy point-to-point systems, using LDTI as the reason to consider an upgrade to a data fabric that captures all components of the requirement—allowing for alignment of the data flows for the end-to-end process.

Insurers must consider key implementation realities as they weigh upgrading their point-to-point systems against moving to a new data fabric environment.

Complexity of Data Integration – Data integration for most LDTI implementations is quite complex due to the significant footprint of policy, claims, financial, investments, and reinsurance data usually residing in a variety of source systems.

Approach to Data Integration – How data is integrated is a key consideration for the IT department because of its broader impact across the technology and data landscape. Those who combine the integration needs of LDTI with the integration needs of other business objectives are able to achieve economies of scale, leveraging the same integration strategy and platform.

Data Connectivity and Maturity – LDTI presents a catalyst opportunity for IT to move away from a legacy point-to-point approach and adopt a modern hub approach. This facilitates the data connectivity for LDTI and sets the foundation for advancing data maturity, enabling advanced analytics outcomes that leverage the potential of artificial intelligence.

Total Cost – When looking at the total cost of ownership, agility, maintainability, advancing data maturity, advancing analytics, and being cloud ready, there’s a clear advantage for a data hub strategy versus a point-to-point strategy.

Revisiting Enterprise Integration – Given the rich dataset required for LDTI, it’s a great opportunity to take a look at your enterprise integration strategy and progressively modernize it through the LDTI initiative.

While many insurers may challenge the cost/benefit of transitioning to a newer data fabric as they seek to comply with the new standard, our experience is that building on a point-to-point strategy is just as expensive as, or more expensive than, implementing a new data fabric platform. Such upgrades also do not easily position the insurer to advance their overall data maturity and governance or to set the stage for capitalizing on the potential of advanced analytics.
With the compliance deadline fast approaching, insurers need to actively and quickly examine and execute their compliance plans. We have established deep expertise in building solutions that immediately address regulation-specific requirements while laying the foundation for solid enterprise data management practices.

We have established a data accelerator known as InsurFabric, based on the Denodo platform, which accelerates the path to LDTI compliance and advances an insurer’s data maturity for advanced analytics. The InsurFabric accelerator shortens the timeline of LDTI implementations with data connectivity, workflow processing, a calculation library, and reports aligned with the requirements of LDTI.


Long Duration Targeted Improvement, or LDTI, is the most significant change in decades to the existing accounting requirements under U.S. Generally Accepted Accounting Principles (USGAAP) for long duration contracts that are non-cancellable or guaranteed renewable, such as life insurance, disability income, long-term care, and annuities.

The ultimate objectives of this accounting standard change are to improve the timeliness of recognizing changes in assumptions for the liability for future policy benefits, to standardize the liability discount rate, to simplify DAC amortization, and to increase the transparency of financial statement disclosures.

There are two groups of LDTI adopters. The complexities of LDTI apply to both:

  1. Early Adopters – These are SEC filers, excluding smaller reporting companies as defined by the SEC, for whom LDTI is effective for fiscal years beginning after December 15, 2022, and for interim periods within those fiscal years.
  2. Fast Followers – All other entities, for whom LDTI is effective for fiscal years beginning after December 15, 2024, and for interim periods within fiscal years beginning after December 15, 2025. These non-SEC insurers have a later implementation date and more time to implement, but should still aim to be fast followers.

Smaller insurers may believe they have a simpler path to compliance; however, they should not underestimate the complexity of the integration LDTI requires between actuarial, accounting, and IT teams, nor the ongoing detailed tracking that will be extremely difficult to do manually.

Insurance Industry – Impact and Implications

This disruption will transform the insurance industry regardless of the type of insurance underwritten. Business functions—from accounting to actuarial—will have to adjust.

Accounting – Accounting for LDTI is not as simple as adjusting the accounting that currently takes place under USGAAP. The grouping required for reserve calculations, ongoing tracking and creation, and eventual disaggregated roll-forwards mean the entire end-to-end process requires more detail and continuity. Many insurers will look to track these details as additional dimensions and attributes directly within their general ledger to more easily tie together their financial statements and disclosures. Mapping those to the data requirements of the new standard is the baseline minimum change.

Data – Data is at the heart of LDTI. Ensuring comprehensive data quality, data controls, and data governance are all at the heart of a functioning LDTI compliance program. Data drives the LDTI accounting engine, is required for traceability and auditing of the results, and will need to flow in a connected manner across your core insurance, actuarial, and financial systems.

Reporting – Minimum compliance reporting requirements under LDTI are well beyond what is required under USGAAP. Generated data from your LDTI accounting engine will feed the minimum reporting requirements. Limiting investment to minimum viable product may be suitable in the short term, but capitalizing on the rich data sets from LDTI can transform an insurer’s business by providing deep customer and business insights to drive market penetration, retention, and profitability.  In the end, the ongoing maintenance of data required for LDTI necessitates automation, as any manual maintenance will continue to become more onerous as your company continues to add new cohorts of business.

IT Systems – Compliance cannot be achieved without substantial alignment of supporting IT systems to match new accounting policies, data accessibility, and data analytics/reporting requirements. For many insurers, this is the time to question the investment in upgrading legacy point-to-point systems using LDTI as the mandate to upgrade to a data hub that captures all components of the requirement—allowing for alignment of the data flows for the end-to-end process.

Actuarial – In many cases, actuarial models will need to be upgraded to be LDTI compliant, adding new components and inputs as well as providing updated outputs of best-estimate cash flows, containing new components such as risk adjustment. Alignment between actual and expected results at a granular level is something LDTI relies on for the LDTI accounting engine to work properly.

Insurance Companies and Investors – Impact and Implications

The changes and volatility introduced by the move from current GAAP to LDTI will affect investors and insurance companies.

Investors – Investors will need to recalibrate expectations for the financial position, profitability, and income volatility of insurance companies following the change in accounting standard from GAAP.

Insurance Companies – Insurance companies should also assess the strategic implications of LDTI to their business in order to develop a plan to address the required changes in processes under a narrowing timeline.

Bottom Line for Insurers

Insurers can use this change as an opportunity for investment into modernizing finance processes to better understand key drivers of change affecting financial statements.

In addition, insurers may want to divest underperforming business that is not central to their strategy, or consolidate a leading position in lines of business that are, to strengthen their market position.

With the runway to compliance shortening, insurers need to be rapidly examining and acting on their compliance plans. The only question left for most insurers is whether they use the opportunity of LDTI compliance to simply create a minimum viable product or to truly transform their business into the 21st century and take a market leadership position.


Optimus SBR has been recognized as a Great Place to Work for a fourth consecutive year! The award exemplifies the bold commitment our team makes every day to uphold Optimus SBR as a place where people come first, and culture is everything.

Optimus SBR demonstrated remarkable resilience, dedication, drive, and creativity during the pandemic and entered 2022 stronger and more engaged than ever. As restrictions lifted, we made a concerted effort to welcome back the team to the workplace that made us a Great Place to Work previously. We re-launched in-person events such as our annual Holiday Party, Thursday Night Birthday Celebrations, Tuesday Talks, Optimus Chats, and Cocktails & Conversations. Extras such as Optimus Days and the Optimus Lite program continued to support work life balance.

This year’s achievement reflects a continued focus on our people and their growth, with initiatives ranging from training and career development opportunities to our seven employee-created and employee-led committees and our commitment to diversity and inclusion.

The 2022 Best Workplaces™ in Canada list is compiled by the Great Place to Work® Institute. The award is based on feedback received through a 2021 annual Great Place to Work® survey of Optimus SBR’s people and an in-depth review of our culture, which evaluates areas such as diversity and inclusion, onboarding practices, and training and development opportunities. The competition process is employee driven, based on two criteria: 75 percent of each organization’s score comes from confidential employee feedback via the globally recognized Trust Index® Survey. The remaining 25 percent is based on the quality, quantity, and effectiveness of the programs and policies that support employees and corporate culture.

 

The explosive growth of data has provided incredible business opportunities but has also presented challenges for many companies, and one of the biggest challenges is ensuring data accuracy. Although we have seen some companies with robust data validation practices, more often companies are in dire need of a systematic and disciplined approach. It is not unusual for business users to tell us they receive inaccurate or questionable data from their data team! The use of “bad” data can significantly impact the performance of the business and in some cases, prove catastrophic.

What is Data Validation, and Why is it Essential?

Data validation, a form of data cleansing, is the process of verifying the accuracy and quality of data before using it.

As a business user, you may not realize that raw data rarely meets an organization’s analytical needs, so it must be manipulated into a form suitable for further analysis. Any data process requiring large volumes of data to be transformed, merged, and cleansed is intrinsically error prone – the more data, the greater the likelihood of error.

For example, when moving and integrating data from different sources and repositories, it may not conform to business rules and may become corrupted due to inconsistencies in type or context.

“Bad” data includes data that is inaccurate, incomplete, inconsistent, duplicated, poorly compiled, or not relevant for its intended use.

“Bad” data can be prevented by following best practices for data validation. The goal is to create data that is consistent, accurate, and complete to ensure the data presented to business users in reports, dashboards, or other tools is correct. When data is consistently accurate, business users trust reports and can confidently make critical decisions based on the right data. If data can’t be trusted, insights from reports or data visualizations can’t be trusted, and ultimately, companies won’t be able to make data driven decisions.

3 Data Validation Best Practices to Prevent “Bad” Data

1.  Start by Verifying Source Data

As a first step, it is critical that the data from each source follows Data Quality measures to ensure that the validation process begins with “high quality” data.

“High quality” data means data that meets the needs of the organization in terms of operations, decision-making, and planning support.

The following Data Quality measures should be verified for each data source:

Accuracy: Are the data records error-free, and can they be used as a reliable source of information?

Completeness: Is data complete for all relevant information?

Consistency: Does information in one table match the same information in another?

Timeliness: Is data readily available when the business needs it?

Validity: Does data follow business rules? Business rules are a set of actions or constraints that are applied to data to comply with data quality standards as well as make the data usable and meaningful to non-technical data consumers.

Uniqueness: Do tables consist of unique sets of data, or is data repeated among tables?

It is strongly recommended that a proactive approach be taken to identify potential data inconsistencies early to avoid the complexity, cost, and time of having to fix them during later stages of the project.

2.  Data Validation During Integration

Data integration is a process that combines data from multiple sources into a single unified data repository.

Once data quality measures have been verified for all data sources, there are many different transformations, integrations, and aggregations required for large volumes of data within an ETL (Extract, Transform, Load) process.

ETL is a data integration process that collects data from original sources (Extract), cleans and combines it into a format that can be analyzed (Transform), and centralizes it into a target repository (Load).

If any single one of the ETL processes is not developed correctly, the resulting metrics will be inaccurate. This in turn may result in unjustified decision making at the business level. Analytics are only as good as the data that supports it, so it is crucial to implement best practices early when developing ETL workflows.

Optimus SBR’s Data practice uses our Analytical Data Mart (ADM) to blend multiple disparate data sources for further analytics and data visualization. Data validation is built into the ADM framework, and each of the ADM’s three tiers – landing, integration, analytics – has a specific purpose and set of validation techniques.

Since a report or dashboard is only as useful as the data that powers it, creating test cases to support data accuracy is crucial in the validation process. Each tier has its own set of test cases that support that tier’s purpose. ETL testing ensures the transfer of data from different sources to a target strictly adheres to transformation rules and remains compliant with all validity checks.

Landing Tier maintains an exact copy of data from the source tables to ensure that a reference of the source is always available.

Test cases: Since the landing layer functions as an exact copy of the original data, the record values and data types must match. The test case for the Landing Tier involves performing a count of records and a list of the metadata of the original and the copy. If these tests are successful, a more detailed test can be performed comparing the values of the original and copy using an ETL tool such as KNIME or Alteryx.
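A minimal pandas sketch of such a landing-tier test follows; the comparison logic stands in for what a dedicated ETL tool would do, and the example frames are invented.

```python
import pandas as pd

def validate_landing_copy(source: pd.DataFrame, landing: pd.DataFrame) -> dict:
    """Landing-tier checks: record counts and column data types must match
    the source exactly, since the landing layer is a one-to-one copy."""
    checks = {
        "row_count_matches": len(source) == len(landing),
        "dtypes_match": source.dtypes.equals(landing.dtypes),
    }
    # Run the more expensive value-level comparison only if the cheap checks pass.
    if all(checks.values()):
        checks["values_match"] = source.reset_index(drop=True).equals(
            landing.reset_index(drop=True)
        )
    return checks

src = pd.DataFrame({"id": [1, 2], "amount": [10.0, 20.0]})
print(validate_landing_copy(src, src.copy()))
```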

Integration Tier combines the raw data from the landing layer by applying transformations and data structure best practices (e.g., consolidations, aggregations, removal of duplicates).

Test cases: There are several test cases involving the structure of the data that must be created at the Integration Tier to test relationships between fields, tables, and structure.

Data redundancy: Normalization is applied to reduce data redundancy. This divides large redundant tables into smaller tables with a specific purpose and links them using relationships.

Data Integrity: Validation occurs before inserting, updating, or deleting the data. Tests can be performed to determine if the metrics contain any incorrect data (e.g., if sales contain any negative numbers or name fields contain special characters). Another test would be to insert inconsistent data to ensure it fails (e.g., insert a product number with 8 characters when the format is 7).
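In pandas, the two checks just described might look like this sketch; the table, its columns, and the 7-character product-number rule come from the examples above.

```python
import pandas as pd

sales = pd.DataFrame({
    "product_number": ["A123456", "B234567", "C3456789"],  # last one has 8 characters
    "amount": [150.0, -20.0, 300.0],                       # a negative sale is suspect
})

# Metric check: sales amounts should never be negative.
negative_sales = sales[sales["amount"] < 0]

# Format check: product numbers are expected to be exactly 7 characters.
bad_format = sales[sales["product_number"].str.len() != 7]

print(f"{len(negative_sales)} negative-amount rows, "
      f"{len(bad_format)} malformed product numbers")
```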

Parent/child: Data behaviour also requires testing. The database should indicate an error when a child record is inserted before a parent record, for example, when a sale is added containing a product number that does not exist in the product table.

Cardinality: Involving the relationship of data in one table joined with another, cardinality refers to whether a relationship is one-to-one, many-to-one, or many-to-many. For example, when testing the relationship between a sales fact table and a product dimension table, many of the products in the fact table will repeat, but the product dimension table will have unique values for products, so joining these tables creates a many-to-one relationship.

To test the cardinality, a distinct list of the primary keys (fields designated to identify unique records) is pulled from both tables. The table with cardinality ‘one’ will have unique values, while the table with cardinality ‘many’ will not. If the product table has duplicate Product IDs, we know it will require some cleansing.
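Here is a small pandas sketch of that cardinality test; the table and column names are illustrative.

```python
import pandas as pd

# The product dimension should be the 'one' side of the join and the sales
# fact table the 'many' side.
product_dim = pd.DataFrame({"product_id": [101, 102, 103]})
sales_fact = pd.DataFrame({"product_id": [101, 101, 102, 103, 103]})

def has_unique_key(table: pd.DataFrame, key: str) -> bool:
    """Pull the key column and test whether all of its values are distinct."""
    return table[key].is_unique

print("product_dim unique:", has_unique_key(product_dim, "product_id"))  # True
print("sales_fact unique:", has_unique_key(sales_fact, "product_id"))    # False
# Duplicate Product IDs on the dimension side would signal cleansing is needed.
```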

Analytical Tier transforms data from the integration layer to create tables and marts that apply specifically to the business. The analytical layer is pulled by a data analyst or data scientist for use by the business. The function of this layer is to answer real business questions, so it is important to test scenarios that the business will regularly ask.

Test cases: For a retail store, for example, an analyst would create a table with metrics such as sales, margin, and cost, sliced by dimensions such as region, brand, and product. The resulting information would answer questions like:

What are the sales by region for this year?

What are the 5 least profitable stores this month?

Which stores improved their sales the most compared to last year?

Test cases are performed that align with how the business slices its data. The answers can then be cross-referenced with data from each tier to verify data accuracy.
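As a sketch, the first question above could be answered like this; the toy mart and its column names are assumptions for illustration.

```python
import pandas as pd

# Illustrative retail mart sliced by region, store, and year.
mart = pd.DataFrame({
    "region": ["East", "East", "West", "West"],
    "store": ["S1", "S2", "S3", "S4"],
    "year": [2022, 2022, 2022, 2022],
    "sales": [120_000, 95_000, 143_000, 88_000],
})

# "What are the sales by region for this year?"
sales_by_region = mart[mart["year"] == 2022].groupby("region")["sales"].sum()
print(sales_by_region)
```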

3.  Automate Tasks for More Efficient Data Validation

When large volumes of data are being validated for analytics, manually sifting through millions of records is not only error prone but very time-consuming. A great way to increase the efficiency of data validation is to automate tasks using SQL functions. Specific test cases are queried by looking at metrics and comparing the source value and the destination value. Each query is wrapped in a named function, so that function can be re-run whenever you make changes to the ETL process.
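Below is a minimal sketch of this pattern using Python’s built-in sqlite3 module; the table names, the metric, and the deliberate mismatch are invented for the demonstration, and a production version would run against your actual source and destination systems.

```python
import sqlite3

def compare_metric(conn, name: str, source_sql: str, dest_sql: str) -> dict:
    """Named validation check: run the same metric against source and
    destination and report the delta, which should be zero."""
    source_value = conn.execute(source_sql).fetchone()[0] or 0
    dest_value = conn.execute(dest_sql).fetchone()[0] or 0
    return {"check": name, "source": source_value,
            "destination": dest_value, "delta": source_value - dest_value}

# In-memory demo with a deliberate mismatch between source and destination.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_sales (amount REAL)")
conn.execute("CREATE TABLE dst_sales (amount REAL)")
conn.executemany("INSERT INTO src_sales VALUES (?)", [(100,), (250,)])
conn.executemany("INSERT INTO dst_sales VALUES (?)", [(100,)])

print(compare_metric(conn, "total_sales",
                     "SELECT SUM(amount) FROM src_sales",
                     "SELECT SUM(amount) FROM dst_sales"))
```

Because each check is a named function, the whole suite can be re-run after any change to the ETL workflow.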

A significant delta will indicate which tables and fields are producing problems in the workflow, so they can be rectified. This greatly reduces the amount of manual work, reduces errors, and speeds up the process of data validation.

A Final Thought

Data Validation has become increasingly important and complex with the massive data projects Optimus SBR is seeing. Businesses need to have absolute trust in their data, and decisions must be based on accurate data. The solution is to implement and adhere to a rigorous data validation process that follows the…

3 Best Practices to Prevent “Bad” Data

  1. Verify Source Data: Start with verifying data quality for each data source before beginning the integration processes.
  2. Validate Data During Integration with Test Cases: Employ the ADM – Analytical Data Mart to perform ETL processes. The ADM has built-in data validation techniques at each tier (Landing, Integration, and Analytics) and uses test cases to support data accuracy.
  3. Automate Data Validation: Automate validation tasks to achieve greater accuracy and efficiency.

Optimus SBR’s Data Practice

Optimus SBR provides data advisory services customized to support the needs of public and private sector organizations. We offer an end-to-end solution, from data strategy and governance to visualization, insights, and training. Click here for more information on our Data practice and how we can help you on your data journey.

Great Place to Work® has just announced that Optimus SBR is on the 2022 list of Canada’s Best Workplaces™ for Hybrid Work! We proudly add this to our roster of awards which include Best Workplaces™ in Canada, Best Workplaces™ in Ontario, Best Workplaces™ in Professional Services, Best Workplaces™ for Giving Back, and Best Workplaces™ for Today’s Youth!

At Optimus SBR, we recognize the value of work/life balance and that it doesn’t look the same for every employee; a flexible work arrangement makes finding that balance easier. Although a flexible work environment for our employees has always been an area of focus, the unprecedented challenges presented by the pandemic really brought it to the forefront.

Optimus SBR reacted quickly to the Covid restrictions and introduced new programs and supports to help our employees adapt to a virtual work environment.

Optimus Days, launched in January 2020, gives all of our employees one company-wide day off per month. This allows everyone additional time to focus on wellness. Optimus Days have been particularly important and appreciated given the extra pressures and challenges associated with the pandemic.

Optimus Lite, which means “no internal meetings” between 12:00 – 1:30 pm daily or Friday afternoons, was introduced in 2020 to reduce virtual screen fatigue. This gives individuals extra time for childcare responsibilities, a chance to get a break from their computers to exercise or get outside, or to have uninterrupted focus time.

Optimus Etiquette Guide helps employees by providing guidelines for how best to interact and connect with people in a virtual environment.

Get Comfortable, introduced during the pandemic, has offered employees the opportunity to borrow equipment from the office, such as a computer monitor or an ergonomically designed chair, to make their home workplace more comfortable.

Virtual Events were held to provide fun, camaraderie, engagement, and make employees feel more connected while working at home.

Lounging with Leaders – interactive Q&A sessions hosted by members of our Executive Leadership Team to get to know them on a more personal level.

Tuesday Talks – themed panel discussions highlight members across different areas of the organization. Panels have included proposal writers, employees with 10+ years of service, Principals, and individuals who have been actively involved in charitable activities within the community.

Happy Hour Hang – these monthly events provided an opportunity for employees to come together virtually to chat, have some fun, and play games. There was something for everyone – Jeopardy, Finish the Lyrics, Charades, Caption This, and Wingin’ It.

Summer Palooza – the pandemic has been difficult for everyone, so we wanted to lighten things up in the summer with a 10-week swag giveaway in which someone in our organization was gifted a prize each week. At the end, every employee received a swag box filled with summer-related goodies like a frisbee, a t-shirt, and a JBL Bluetooth speaker.

Beyond accommodating employees in a hybrid work environment with equipment for their home offices and virtual events to stay connected, Optimus SBR has addressed the huge toll the pandemic has taken on mental health.

Building on everything we have already achieved for mental wellness in the workplace, we have taken our commitment to the next level. One key initiative has been Optimus Chats, a program where employees are paired with a different colleague each month for a 15-minute chat.

As Covid restrictions have lifted and employees have started to return to the office on a part-time basis, Optimus SBR has put concerted effort into creating a workplace that people want to go back to.

We’ve developed a staggered and flexible return to office program which has included team building lunches and meet and greet cocktail parties for new hires. Our hybrid work environment will no doubt continue to evolve as we continue to seek feedback from our employees on what hybrid workplace model will best meet their needs.

For more information please visit www.greatplacetowork.ca.

Optimus SBR is honoured to be recognized on the 2021 list of Best Workplaces™ for Giving Back in Canada. We feel fortunate to be a part of an organization that has built a culture around putting people first, both within our organization and the greater community. Congratulations go to our Corporate and Social Responsibility Committee, Volunteers, and Donors!

Optimus SBR gives back to the community with the same enthusiasm and entrepreneurial spirit that has consistently earned us the distinction of being named one of the Best Workplaces™ in Canada. At Optimus SBR we are committed to giving back because we are passionate about building a stronger community, improving social equity, helping those in need, empowering others to make change, and being a socially responsible organization. Since the inception of our Community and Social Responsibility Committee in 2012, we have supported over 40 different organizations, and raised and donated over $600,000!

Over the last year, the pandemic contributed to a sharp increase in demand for charitable services. Despite this surge in need, we weren’t deterred by the transition to the new work-from-home reality or the limitations imposed by the pandemic. We doubled down on our efforts, supporting 11 organizations, contributing 976 volunteer hours, and raising and donating $80,187! The Fort York Food Bank and the RBC Sunnybrook Race for the Kids were among the charities we supported, and we hosted our first ever Virtual Pay It Forward Fundraiser, raising over $23,000 in a single night.

The 2021 Best Workplaces™ for Giving Back in Canada list is compiled by the Great Place to Work® Institute, which builds its Best Workplace lists from direct feedback gathered from employees at the hundreds of organizations surveyed. To be eligible for this list, organizations must be Great Place to Work-Certified™ in the past year, and at least 90% of employees must feel good about the way their company contributes to the community. The Institute determines the best by the overall Community Investment Index score from employees as well as the range and quality of programs that encourage workplace community investment.

For more information please visit www.greatplacetowork.ca.

If your organization is having difficulty deriving greater insight, understanding, and intelligence from your data, you are not alone.  Data is everywhere. But accessing, organizing, and making sense of it can be daunting.

Better data means better decisions

It’s clear that readily available, up-to-date, data-driven intelligence leads to better decisions and ultimately to better business performance. Yet many companies are far from having business intelligence that can inform strategic decision-making across the company. This is often due to their inability to access and integrate data from multiple source systems. It’s not as simple as importing data and analyzing it. Rules must be established that govern who can do that, how it’s to be done, and what questions should be asked of the data.

Although most companies attempt to take advantage of data repositories, results vary. The least sophisticated environments have data scattered across different, disconnected, spreadsheets and documents. Employees may not get the information they need because it’s not organized or available in a consumable format. Furthermore, the data can’t easily be shared, and no one is in charge of data governance. Environments at the other end of the maturity spectrum have systems in place to ensure that data is organized, curated, and accessible to all levels of staff in the form of actionable dashboards.

The ultimate goal for decision makers is to have trusted data at their fingertips that will facilitate information-based decisions. The first step in becoming more data driven is to build a roadmap indicating where you are in terms of the organization’s business intelligence maturity and where you want to be. The roadmap should reveal what you are doing well and what areas need improvement, where you need to go next, and how to get there.

UBIR Business Intelligence Roadmap

There are many business intelligence maturity models available that help you identify how mature your data and analytics strategy is, and where you eventually want to be. But they often lack guidance on how to get there. Our team of data experts created UBIR, a Business Intelligence Roadmap, in response to customers who asked us for a maturity model that would show them what they need to do to get to the next stage of business intelligence maturity.

Our Secret Sauce

UBIR is different than other maturity models in that it’s an interactive roadmap that is task-based, detailing specific assignments and accomplishments to reach the next phase of maturity. Tasks and functions are either completed or not, and the next milestone is clearly outlined.

Click here to explore this interactive roadmap

UBIR directs the user through five progressively advanced phases of activities and implementations – Basic, Tactical, Focused, Strategic, and Transformational – intersected by People, Process, and Technology. It’s critical to recognize that all three of these elements are required; neglecting any one of them will cause BI systems to fail.

A Final Thought

If the ultimate goal for decision makers is to have trusted data at their fingertips to facilitate information-based decisions, then an action-oriented guide to get them there is a critical first step in that journey. UBIR helps organizations create a detailed roadmap to progressively increase their business intelligence maturity, allowing them to better leverage data to make smarter decisions.

Optimus SBR’s Data Practice

Optimus SBR provides data advisory services customized to support the needs of public and private sector organizations. We offer an end-to-end solution, from data strategy and governance to visualization, insights and training. Click here for more information on our Data practice and how we can help you on your data journey.

Power BI and Tableau are the most popular business intelligence tools on the market. Both deliver exceptional dashboards and contain all the basic capabilities an organization would need. Although the tools are similar, there are key differences that organizations should be aware of when considering analytical requirements.

Performance & Deployment

Tableau

Because Tableau uses a columnar data structure, it can fetch and process billions of rows without impeding performance; users can also leverage Hyper extracts to process large and/or complex data sets. The downside is that although Tableau offers cloud-hosted solutions with Tableau Online and Tableau CRM, it does not have a cloud-native architecture. On-premises customers can experience limitations in their scaling options, such as the inability to use the cloud’s elasticity for dynamic workloads. Yet if deployment flexibility is important, Tableau has more cloud-based and on-premises options than Power BI.

Power BI

Power BI does not handle bulk data as well as Tableau and does impose a limit on the amount of data set storage. With a Pro license there is a limit of 10 GB; larger data sets require a Power BI Premium license. That said, 10 GB should suffice for most users. As for deployment, Power BI does offer on-premises and cloud versions, with Azure available only for cloud deployments.

Data Preparation

Both Tableau and Power BI offer a tool to transform data prior to loading it for visualization.

Tableau

With a Creator licence, Tableau includes a stand-alone application called Tableau Prep Builder to profile, analyze, clean, and execute common data preparation tasks. Users with minimal technical experience can quickly learn the required skill set thanks to its smart features, which range from data cleaning to a visual layout of the data preparation process, similar to other Extract, Transform, and Load (ETL) tools on the market.

Power BI

Data preparation in Power BI is integrated into the desktop application itself using Power Query, just like other Microsoft products such as Excel. Hence, it provides a more seamless experience than Tableau, and users do not need a paid license. Power Query Editor offers very similar ETL capabilities to Tableau Prep Builder, yet users might find it less intuitive when numerous queries are required, since there is no visual layout of the process itself. However, it can perform more advanced ETL operations by allowing users to write their own queries using the Power Query Formula Language (M).

Learning Curve

Tableau

Tableau’s minimalistic layout can require some adjustment for beginners, as the options are not obvious. Users with no prior BI experience might need some basic training to understand the fundamentals. However, Tableau’s learning curve is not significantly steeper than Power BI’s, and some users might prefer Tableau’s more intuitive Visual Query Language to Power BI’s DAX language.

Power BI

Power BI is generally considered easier to learn than Tableau because its layout is similar to other Microsoft Office 365 applications. The top ribbon is familiar to Excel users, which allows them to easily navigate between features and capabilities. Creating a visualization can also feel more straightforward: choose the type of visualization first, then drag and drop fields onto the identified inputs.

Data Visualization

Tableau

Tableau facilitates the creation of more functional and aesthetic data visualizations. A comparison of the Tableau Public Gallery with the Power BI Data Stories Gallery makes it apparent that Tableau delivers more impressive-looking data visualizations, even when large numbers of data points are involved.

Power BI

For most users, though, Power BI satisfies their reporting needs and delivers impactful dashboards.

 

Sharing and Collaborating

In both cases, dashboards and reports can be uploaded to a web service and viewed by users within the organization or by external users. Access control, the option to subscribe to reports (to stay up to date), and the option for multiple users to edit reports online are features included in both tools.
Where they differ is in the ease of collaborating and sharing between teammates. Because Power BI integrates with other Microsoft products such as SharePoint and O365, organizations already using Microsoft products and services can find collaboration more natural and less disruptive than transitioning to a new product such as Tableau. For example, users can connect to Excel files hosted on SharePoint, share reports using Teams, and embed them in SharePoint documents for internal consumption.

Cost

Tableau

Tableau licences follow a straightforward monthly rate based on the type of licence and deployment. Pricing for a deployment fully hosted on Tableau Online is slightly higher than for one that only includes Tableau Server.

Power BI

Overall, Power BI is considerably less expensive than Tableau and as such is a more affordable option for smaller organizations. However, Tableau’s premium pricing might be justified by its support for large data volumes and additional functionality that Power BI does not have.

There is only one type of individual license, and Power BI pricing depends on whether Power BI Premium is enabled. In short, Power BI Premium is a capacity-based license for organizations requiring greater scale and performance for their content. It offers additional features and capabilities, notably the ability to share Power BI content with anyone without purchasing a per-user license.

Both tools offer free options, but their limitations differ. With a free license in Power BI, users have access to all the software’s capabilities but can’t collaborate or share their reports. Conversely, with Tableau Public, Tableau’s free option, all published data sources and dashboards are open to the public.

Machine Learning

Tableau

In March 2021, Tableau introduced Einstein Discovery, a new feature capable of creating predictive models. This was a long-anticipated and overdue feature to rival Power BI.

Power BI

Microsoft has made great strides in introducing machine learning capabilities to users who are not necessarily tech savvy by minimizing the programming skills required. For instance, with Premium capacity, users can apply Microsoft machine learning models for text and visual analytics within the desktop app with a few clicks. Power BI also enjoys deep integration with other Microsoft services such as Azure Machine Learning and SQL Server Analysis Services. These features cost extra but can be the right approach for organizations looking to apply simple or common machine learning models quickly.

Support

Tableau

Tableau has a knowledge base for the most common how-to and troubleshooting issues. That said, it is with its strong online community that Tableau outshines Power BI. Tableau users tend to be more passionate about data visualization than their Power BI counterparts, perhaps due to the better end-user experience and more aesthetic visualizations. Tableau’s longer presence on the market also helps in creating a more active community.

Power BI

Support for Power BI continues to improve with its growing popularity and the increase in content and organized community events.

Summary

Performance & Deployment: Tableau can handle larger volumes of data and has more deployment options than Power BI.

Data Preparation: Power BI’s Power Query is built into the desktop application itself. Tableau’s stand-alone Tableau Prep Builder is easier to use but less seamless.

Learning Curve: Power BI is easier and faster to learn due to its familiar Office 365 layout, but Tableau’s Visual Query Language is more intuitive than Power BI’s DAX.

Data Visualization: Tableau has an edge on creating more attractive visualizations than Power BI and offers more advanced features.

Sharing and Collaborating: Organizations that have adopted Microsoft products and services in their workflows will find it easier to share and collaborate with Power BI. If Salesforce products and services are used, Tableau would be the better option.

Cost: Power BI is more affordable for small teams. Tableau can be a better option for larger organizations.

Machine Learning: Power BI has more advanced machine learning capabilities due to its deep integration with other Microsoft platforms, such as Azure Machine Learning.

Support: Tableau has a stronger support community because it was developed before Power BI. However, online content can easily be found for either, since these are the most widely used BI tools today.

A Final Thought

While this article provides a good overview of Tableau versus Power BI, there are many different factors that organizations must consider when sourcing a BI tool. Our team of consultants can guide you through the process, making sure the right questions are being asked to come to the solution that makes sense for your business.

Optimus SBR’s Data Practice

Optimus SBR provides data advisory services customized to support the needs of public and private sector organizations. We offer an end-to-end solution, from data strategy and governance to visualization, insights and training. Click here for more information on our Data practice and how we can help you on your data journey.
