
Profisee + Talos – Partnership Announcement

Profisee has formally partnered with Talos to help deliver Master Data Management (MDM) solutions to Australian businesses and enhance your Modern Data Platform.

Profisee’s Master Data Management software makes it easy and affordable for companies of all sizes to build a trusted foundation of data across the enterprise. Talos’ expertise ensures a successful implementation that delivers value to the business.

Why is an MDM solution important?

The key benefit of an MDM solution is to improve the quality of your data when it is held in multiple systems. This can quickly yield benefits simply by reducing duplicated effort – for example, cutting down the number of duplicate recipients on a corporate Christmas card list!
An MDM solution is agnostic to the source systems but can integrate back to them to keep data clean. In plain English, this means you can independently manage the data that resides in multiple systems, but allow the MDM solution to send cleaned data back to them.
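To make the duplication point concrete, here is a minimal sketch of the kind of matching and merging an MDM tool automates (the source systems, columns and simple email-based matching rule are purely illustrative, not how Profisee itself is configured):

```python
import pandas as pd

# Hypothetical customer extracts from two source systems
crm = pd.DataFrame([
    {"name": "Jane Citizen", "email": "jane.citizen@example.com", "source": "CRM"},
    {"name": "Raj Patel",    "email": "raj.patel@example.com",    "source": "CRM"},
])
erp = pd.DataFrame([
    {"name": "JANE CITIZEN", "email": "Jane.Citizen@EXAMPLE.com", "source": "ERP"},
])

customers = pd.concat([crm, erp], ignore_index=True)

# Match on a normalised email address and keep a single "golden" record per person,
# so the corporate Christmas card list only has one Jane on it
customers["match_key"] = customers["email"].str.strip().str.lower()
golden = customers.drop_duplicates(subset="match_key", keep="first")

print(golden[["name", "email", "source"]])
```

A real MDM platform layers fuzzy matching, survivorship rules, stewardship workflows and write-back to the source systems on top of this basic idea.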

A good tool supports ongoing stewardship and governance through workflows, monitoring and corrective-action techniques that can all be managed by an end user.

Why Profisee?

Profisee is ideal for companies looking to get started with MDM and seeking an easy-to-use, rapidly deployed solution at an affordable price point. Price matters: many enterprise MDM tools come with hefty licence fees and services overheads that simply don’t suit many businesses.
According to Gartner, customers report that Profisee is economically priced with a favourable TCO. Combined with shorter implementation time frames – Profisee had more implementations taking under three months than any other vendor in the 2020 Magic Quadrant – this makes it an ideal tool for an initial MDM implementation.
Contact us today to speak with one of Talos’s experts.







    The 3 faces of Enterprise PowerBI – Training Personas


    PowerBI Training Personas! Since self-service BI became an actual thing, we have advised many organisations on how to roll it out successfully and deliver the best ROI, and understanding the users is essential. A key mantra for us has always been to tune the content to the audience. After all, you book The Wiggles for your children’s party, not a heavy metal band (well, unless they are into Hevisaurus!).

    Over the course of working with organisations to define these audiences, there are consistently 3 personas that come up, each with their own needs in terms of data enablement and training. These 3 personas are:

    • The Actor
    • The Creator
    • The Analyst

    Let’s take a quick walk through these PowerBI Training Personas!

    The Actor

    This user makes up the bulk of users in most organisations. Their use of data and reporting is as an input to drive or inform decision making. The individuals in these roles can range from the CEO who needs an at-a-glance dashboard of their organisation’s performance, to a field sales worker who needs to know the buying profile of their next client.

    The level of interactivity with any reporting will be basic – maybe selecting a few options or filters, perhaps drilling down a preset path to get some finer detail. Consequently the degree of training and enablement they need is fairly light. Key information for them is where to find the report, what the data on the report actually means and what buttons to press.

    The Creator

    This type of user is actually one of the most important in terms of organisational impact. These are the people that are tasked with generating content for the Actors, and as such have a deep understanding of the data in their domain. These are the people tasked to work with the technology experts to build out the data models that drive much of the self service capability.

    These users really get into the guts of the data and understand it in depth. When an Actor asks for an explanation of a particular data point, they are the ones who have to investigate. The technical training they need focuses on content creation and publishing. The enablement needs to cover things like business analysis skills, Master Data Management and Data Cataloguing.

    The Analyst

    Most people calling themselves PowerBI experts sit here, and most organisations have a handful of them – they are not a big population (though a vocal one!). They sometimes fill the role of Creator, but more often than not they are trying to make sense of the organisation’s data, how it interlinks and where the hidden treasure lies. They “self-serve” from the raw data, constructing new ways to get insight into the organisation’s performance.

    Here enablement focuses on technical capability as they need to understand how to manipulate and interrelate data that may be in less than ideal forms and certainly hasn’t been through an enterprise cleaning process. Data Catalogues support them in the discovery process.

    To wrap

    The key message to take from this post is that when rolling out PowerBI at scale these different communities and capabilities need to be addressed – and in different ways. There is no value in sending all of your team on advanced PowerBI training if the skills learned will never find a practical application. Similarly if you build it, they won’t come – you need to guide them there.

    Good luck!

    (Adapted from an original LinkedIn article by the author)

    Azure ML PowerBI


    Leveraging Azure ML Service Models with Microsoft PowerBI

    Machine Learning (ML) is shaping and simplifying the way we live, work, travel and communicate. With the Azure Machine Learning (Azure ML) Service, data scientists can easily build and train highly accurate machine learning and deep-learning models. Now PowerBI makes it simple to incorporate the insights and predictions from models built by data scientists on the Azure Machine Learning service into PowerBI reports, using simple point-and-click gestures. This gives business users better insights and predictions about their business.

    This capability can be leveraged by any PowerBI user (with an access privilege granted through the Azure portal).  Power Query automatically detects all ML Models that the user has access to and exposes them as dynamic Power Query functions.

    This functionality is supported for PowerBI dataflows, and for Power Query online in the PowerBI service.

    Schema discovery for Machine Learning Service models

    Unlike Machine Learning Studio (which helps automate the task of creating a schema file for the model), with the Azure Machine Learning Service data scientists primarily use Python to build and train their models, so the schema describing a model’s expected inputs and outputs generally has to be declared explicitly in the scoring script.
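    As an illustrative sketch only (the model file name, feature columns and churn-style example are assumptions, not part of the original post), a scoring script for an Azure ML Service model can declare its expected inputs and outputs with the inference-schema decorators so that consumers such as PowerBI can discover the model’s signature:

```python
import json
import os

import joblib
import numpy as np
import pandas as pd
from inference_schema.schema_decorators import input_schema, output_schema
from inference_schema.parameter_types.pandas_parameter_type import PandasParameterType
from inference_schema.parameter_types.numpy_parameter_type import NumpyParameterType

model = None

def init():
    # Load the trained model once when the web service starts.
    # In Azure ML the model artefact is typically found under AZUREML_MODEL_DIR.
    global model
    model_path = os.path.join(os.getenv("AZUREML_MODEL_DIR", "."), "model.pkl")
    model = joblib.load(model_path)

# Sample frames define the schema published with the web service;
# the column names below are placeholders for your own features.
input_sample = pd.DataFrame({"tenure_months": [12], "monthly_spend": [99.0]})
output_sample = np.array([0])

@input_schema("data", PandasParameterType(input_sample))
@output_schema(NumpyParameterType(output_sample))
def run(data):
    try:
        result = model.predict(data)
        return result.tolist()
    except Exception as exc:
        return json.dumps({"error": str(exc)})
```

    With the schema declared in this way, the model’s input parameters can be mapped automatically to the PowerQuery function parameters described in the steps below.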

    Invoking the Azure ML model in PowerBI

    1. Grant the PowerBI user access to the Azure ML model: to access an Azure ML model from PowerBI, the user must have Read access to the Azure subscription. In addition:
    • For Machine Learning Studio models, Read access to Machine Learning Studio web service
    • For Machine Learning Service models, Read access to the Machine Learning service workspace
    2. From your dataflow in PowerBI, select the Edit button for the dataset that you want to get insights about, as shown in the following image:
    Azure ML PowerBI Edit Dataset

    3. Selecting the Edit button opens the PowerQuery Editor for the entities in your dataflow:
    Azure ML PowerBI PowerQuery

    4. Click the AI Insights button (on the top ribbon), and then select the “Azure Machine Learning Models” folder from the left navigation menu. All the Azure ML models you have access to appear as PowerQuery functions, and the input parameters for each Azure ML model are automatically mapped as parameters of the corresponding PowerQuery function.
    Azure ML PowerBI AI Insights

    5. To invoke an Azure ML model, specify the column of your choice as an input.

    6. To preview the model’s output, select Invoke. This adds the model’s output column, and the model invocation also appears as an applied step for the query.
    Azure ML PowerBI Invoke

    Summary

    With this approach we can integrate ML models (built using either the Azure ML service or studio) with PowerBI reporting. Any user (typically a BI analyst) can apply the models built by data scientists to the relevant datasets, whether the problem is classification, regression or another form of prediction. Used together, these enhancements to Microsoft PowerBI give business users better insights, which in turn aids better decision making.

    Let our Data Visualisation and Machine Learning experts help you explore the potential – contact us today!

    Data & AI Strategy metrics


    Why are Data & AI strategy metrics important? The beauty of “strategies” for some is that a strategy – unlike a tactic – often doesn’t come with any clear success or failure KPIs. It allows a lot of wriggle room for ambiguous assessments of whether it worked or not. However, any self-respecting Data & AI strategy should not allow this. After all, it is designed and executed in the name of improving the use of data and measurable outcomes within an organisation. A good Data & AI strategy should have measures to determine its success.

    Data & AI Strategy metrics that matter

    Commonly raised metrics are based around uptake and usage (software vendors are particularly fond of these). This seems to be based on the hope that the apparent usage of tools is inherently a good thing for a company and will somehow lead to – I don’t know – increased synergy?

    Dilbert Utilising Synergy

    Sometimes the metrics are based around data coverage in the EDW or project completion. However, if I were to put my CEO hat on, I would want to know the answer to the question “how are all these Data & AI users improving my bottom line?”. After all, if the Data & AI tools are being heavily used, but only to manage the footy tipping competition, then I’m not seeing a great deal of ROI.

    The metrics that matter are the Corporate metrics.

    A good Data & AI Strategy should be implemented with a core goal of supporting the Corporate strategy, which will have some quantifiable metrics to align to. If not, a good Data & AI strategy isn’t going to help you much as your organisation has other problems to solve first!

    In a simple case, imagine a key part of the strategy is to expand into a new region. The Data & AI strategy needs to support that by providing the data and tools that support that goal, enabling the team in the new region to expand – and it should be measured against its ability to support the success of the Corporate strategy.

    This is why at FTS Data & AI, our first step in defining a Data & AI Strategy for an organisation is to understand the Corporate strategy – and its associated metrics – so we can align your Data & AI strategy to it and build a business case for embarking on a Data & AI strategy in the first place. The metrics are the foundation that proves there is deliverable value to the business. This is why the Corporate Strategy sits at the top of our Strategy Framework:

    Data & AI Strategy Framework

    We have extensive experience designing strategies that support your business. Contact us today to speak with one of our experts.

    Data Quality: Enter the 4th Dimension


    Data quality is a consistent cause of deep pain when establishing a trusted data platform in Data & AI projects. The more systems that are involved, the harder it gets to clean up – and that is before you even start accounting for how old the systems are, how up to speed the SMEs are, or how poor the front-end validation was. There is a host of potential problems. However, something tells me that the number of projects where the customer says it’s OK if the numbers are wrong is going to remain pretty small.

    Scope, Cost, Time – Choose one. But not that one.

    Project Management Triangle

    Data Quality is a project constraint

    Many of you will be familiar with the Project Management Triangle, which dictates that you vary two of Scope, Cost and Time in order to fix the other. The end result is that, in the middle, Quality gets affected. For most Data & AI projects I have found that cost and time tend to be the least negotiable, so scope gets restricted. Yet somehow Time and Cost get blown out anyway.

    Whilst Data & AI is hardly unique in terms of cost and schedule overruns, there is one key driver which is neglected by traditional methods. Leaning once again on Larissa Moss’s Extreme Scoping approach, she calls out the reason: in a Data & AI project, Quality – specifically Data Quality – is also fixed. The data must be complete and accurate for it to be usable, and there is no room for negotiation on this. Given that the data effort consumes around 80% of a Data & AI project’s budget, this becomes a significant concern.
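    As a trivial illustration of treating Data Quality as a fixed constraint rather than a lever to pull, a load can be gated on completeness and duplication checks before any scope, cost or time trade-off is even discussed (a sketch only; the dataset, columns and pass/fail rules here are hypothetical):

```python
import pandas as pd

def quality_gate(df, key_columns):
    """Refuse to load data that is not complete and consistent on its key columns."""
    # Completeness: no missing values allowed in the columns the business relies on
    incomplete = df[key_columns].isna().any(axis=1).sum()
    # Consistency: no duplicate business keys that would double-count facts
    duplicates = df.duplicated(subset=key_columns).sum()
    if incomplete or duplicates:
        raise ValueError(
            f"Quality gate failed: {incomplete} incomplete rows, {duplicates} duplicate keys"
        )

# Hypothetical extract from a source system
orders = pd.DataFrame({
    "order_id":    [1, 2, 2, 3],
    "customer_id": ["C1", "C3", "C3", None],
    "amount":      [100.0, 250.0, 250.0, 80.0],
})
quality_gate(orders, ["order_id", "customer_id"])  # raises ValueError - the load must not proceed
```

    The point is not the code, but that the gate does not move: when it fails, it is scope, cost or time that has to give, not the quality of the numbers.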

    How do we manage Data Quality as a constraint?

    We have to get the business to accept that the traditional levers can’t be pulled in the way they are used to, and that requires end-user education. The business needs to be made aware that Data Quality is a fixed constraint – one that it is imposing, albeit implicitly. It also has to accept that if Quality is not a variable, then the traditional “pick two to play with” becomes “prepare to vary all three”. Larissa Moss refers to this as an “Information Age Mental Model”, which prioritises quality of output above all else.

    This is where strong leadership and clear communication come into play. Ultimately, if one part of the business demands a certain piece of information, the Data & AI project team will have to make clear that to obtain that data at the mandated quality, the business must be prepared to bear the costs of doing so – including the cost of bringing it up to an enterprise-grade, reusable standard so that it integrates with the whole solution, for both past and future components of the system. This of course does not mean that an infinite budget is opened up for every data item; some data may not be worth the cost of acquisition. What it does mean is that the discussion about costs can be more honest, and the consumer can be more aware of what drives the issues that will arise in trying to obtain their data.