Microsoft Gold Partners Data Analytics


We are excited to announce that Talos has become a Microsoft Gold Partner in Data Analytics! This is an important milestone in our ongoing partnership with Microsoft in Australia, and shows we have met the high standards expected of partners that qualify for the program. The Data Analytics competency demonstrates deep expertise in the associated technologies: SQL Server, Power BI and Azure capabilities such as Azure Data Factory.

Talos Microsoft Gold Partners Data Analytics

Microsoft is a key partner for us, providing an industry-leading platform for delivering Data Analytics solutions to our customers. Achieving this partnership level demonstrates the capabilities of our technical team and recognises the customer successes we have helped drive.

What does Microsoft Gold Partners in Data Analytics mean for customers?

Becoming a Gold Partner in Data Analytics benefits our customers in two ways: confidence in our expertise, and access to the latest technology.

For our customers it demonstrates that they are working with a technology partner that has certified personnel with access to the latest training and product updates. We must maintain a number of certified people on our team who are qualified in the Data Analytics platforms Microsoft provides. These certifications cover on-premises and cloud technologies, ranging from traditional Business Intelligence through to Big Data solutions. Gold Partners also get access to cutting-edge product training, as we are encouraged to upskill on the latest platforms.

Gold Partners also receive product information, roadmaps and industry insights direct from the Microsoft Data Platform team, including product news as the suite is updated. We are also given Azure credits to trial new capabilities, as part of our commitment to keeping our customers informed of the latest developments.

Profisee partners with Talos


Profisee + Talos – Partnership Announcement

Profisee has formally partnered with Talos to help deliver Master Data Management (MDM) solutions to Australian businesses and enhance your Modern Data Platform.

Profisee’s Master Data Management software makes it easy and affordable for companies of all sizes to build a trusted foundation of data across the enterprise. Talos’ expertise ensures a successful implementation that delivers value to the business.

Why is an MDM solution important?

The key benefit of an MDM solution is improved data quality when your data is held in multiple systems. This can quickly yield benefits through a simple reduction in duplicated effort – for example, reducing the number of duplicate recipients on a corporate Christmas card list!
An MDM solution is agnostic to the source systems but can integrate back to them to keep data clean. In plain English, this means you can independently manage the data that resides in multiple systems, while allowing the MDM solution to send cleaned data back to them.
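To make the Christmas-card example concrete, here is a minimal, illustrative Python sketch of collapsing duplicate contacts from two systems onto a single golden record. It is not Profisee's matching engine – real MDM tools apply far richer fuzzy matching and survivorship rules – but it shows the basic idea of a matching key.

```python
# Illustrative only: real MDM tools such as Profisee use far richer
# survivorship rules and fuzzy matching than this exact-key approach.

def normalise(name: str, email: str) -> tuple:
    """Reduce a contact record to a comparable matching key."""
    return (" ".join(name.lower().split()), email.strip().lower())

def golden_records(records: list) -> list:
    """Keep one record per matching key; the first occurrence wins."""
    seen = {}
    for rec in records:
        seen.setdefault(normalise(rec["name"], rec["email"]), rec)
    return list(seen.values())

crm = [{"name": "Jane  Smith", "email": "JANE@example.com"}]
billing = [{"name": "jane smith", "email": "jane@example.com"},
           {"name": "Tom Jones", "email": "tom@example.com"}]

# Two golden records instead of three mailed cards.
print(len(golden_records(crm + billing)))  # → 2
```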

A good tool supports ongoing stewardship and governance through workflows, monitoring and corrective-action techniques that can all be managed by an end user.

Why Profisee?

Profisee is ideal for companies getting started with MDM who want an easy-to-use, rapidly deployed solution at an affordable price point. Price matters: many enterprise MDM tools carry significant price tags and services overhead that do not suit many businesses.
According to Gartner, customers report that Profisee is economically priced with a favourable TCO. Combined with short implementation time frames – Profisee had more implementations completed in under three months than any other vendor in the 2020 Magic Quadrant – this makes it an ideal tool for initial MDM implementations.
Contact us today to speak with one of Talos's experts.

The 3 faces of Enterprise Power BI – Training Personas


Power BI Training Personas! Since self-service BI became mainstream, we have advised many organisations on how to roll it out successfully and get the best ROI, and understanding the users is essential. A key mantra for us has always been to tune the content to the audience. After all, you book The Wiggles for your children's party, not a heavy metal band (well, unless they are into Hevisaurus!).

Over the course of working with organisations to define these audiences, three personas consistently come up, each with their own needs in terms of data enablement and training. These three personas are:

• The Actor
• The Creator
• The Analyst

Let's take a quick walk through these Power BI Training Personas!

The Actor

This user makes up the bulk of users in most organisations. Their use of data and reporting is as an input to drive or inform decision making. The individuals in these roles range from the CEO who needs an at-a-glance dashboard of their organisation's performance, to a field sales worker who needs to know the buying profile of their next client.

Their level of interactivity with any reporting will be basic: perhaps selecting a few options or filters, or drilling down a preset path for finer detail. Consequently the training and enablement they need is fairly light. The key information for them is where to find the report, what the data on it actually means, and which buttons to press.

The Creator

This type of user is actually one of the most important in terms of organisational impact. These are the people tasked with generating content for the Actors, and as such they have a deep understanding of the data in their domain. They work with the technology experts to build out the data models that drive much of the self-service capability.

These users really get into the guts of the data and understand it in depth. When an Actor asks for an explanation of a particular data point, they are the ones who have to investigate. The technical training they need focuses on content creation and publishing, and their enablement should cover skills such as business analysis, Master Data Management and Data Cataloguing.

The Analyst

Most people calling themselves Power BI experts sit here, and most organisations have a handful of them; they are not a big population (though a vocal one!). They sometimes fill the role of Creator, but more often they are trying to make sense of the organisation's data, how it interlinks and where the hidden treasure lies. They "self-serve" from the raw data, constructing new ways to gain insight into the organisation's performance.

Here enablement focuses on technical capability, as they need to understand how to manipulate and interrelate data that may be in less-than-ideal forms and certainly hasn't been through an enterprise cleaning process. Data Catalogues support them in the discovery process.

To wrap up

The key message of this post is that when rolling out Power BI at scale, these different communities and capabilities need to be addressed, and in different ways. There is no value in sending your whole team on advanced Power BI training if the skills learned will never find a practical application. Similarly, if you build it, they won't simply come: you need to guide them there.

Good luck!

(Adapted from an original LinkedIn article by the author)

Azure DevOps Power BI Reporting


Azure DevOps (ADO) is fast becoming the application lifecycle management tool of choice for modern organisations. With boards, CI/CD pipelines and Git repo capabilities, Agile practices have never been easier to implement in project management. However, as a DevOps tool, it is understandably not designed to be a fully-fledged reporting and analytics solution as well. Luckily, Power BI can integrate with ADO to deliver the kind of enterprise reporting that project managers need to properly monitor their projects. This blog post covers Azure DevOps Power BI reporting and some examples of the reporting available.

Connecting to ADO

A connection to ADO is made via the OData feed option in Power BI. Once connected, you select the relevant tables to begin building a data model. For most project managers, the main objective in ADO reporting is clear visibility of work item progress, so the tables imported into the model should contain information relating to Work Items and Iterations.
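As a concrete illustration, the Analytics feed URL follows a predictable pattern. In this Python sketch the organisation and project names are placeholders, and the OData version segment (v3.0-preview here) may differ for your ADO instance:

```python
# Sketch of building the Analytics OData feed URL that Power BI's OData
# connector consumes. "myorg"/"myproject" are placeholder names, and the
# version segment may vary between tenants.

def feed_url(org: str, project: str, entity: str, select: list) -> str:
    """Build an OData feed URL with a $select projection."""
    base = f"https://analytics.dev.azure.com/{org}/{project}/_odata/v3.0-preview"
    return f"{base}/{entity}?$select=" + ",".join(select)

url = feed_url("myorg", "myproject", "WorkItems",
               ["WorkItemId", "Title", "State", "WorkItemType"])
print(url)
```

Pasting a URL like this into Power BI's OData connector exposes the corresponding tables for modelling.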

Once imported, some simple transformations are required to clean the data. It is crucial that the correct relationships are created between the work item tables, because work items in ADO are hierarchical: Epics contain Features, which contain User Stories, which contain Tasks. This hierarchy must be captured in the model for the reporting to make sense.
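That hierarchy boils down to parent links between work items. This toy Python sketch (sample data and field names are illustrative, not the actual OData schema) walks a Task back up to its Epic, which is exactly the relationship the model's joins must reproduce:

```python
# Toy data showing the Epic > Feature > User Story > Task chain via
# parent links (field names are illustrative, not the OData schema).
items = {
    1: {"title": "Epic A", "type": "Epic", "parent": None},
    2: {"title": "Feature B", "type": "Feature", "parent": 1},
    3: {"title": "Story C", "type": "User Story", "parent": 2},
    4: {"title": "Task D", "type": "Task", "parent": 3},
}

def path_to_epic(item_id: int) -> list:
    """Walk parent links upwards and return the Epic-to-Task path."""
    chain = []
    while item_id is not None:
        chain.append(items[item_id]["title"])
        item_id = items[item_id]["parent"]
    return list(reversed(chain))

print(path_to_epic(4))  # → ['Epic A', 'Feature B', 'Story C', 'Task D']
```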

Once the model has been created, the report visualisations can be built. In our experience, a clearly constructed table outlining key work item information – Sprint, Epic, Feature, User Story, Task, Assigned To, Completed Date, Task Number and Sprint Percentage – is precisely what project managers want to see. Although not the most visually compelling report, these tables clearly articulate work progress in a single view, something not easily achieved natively within ADO.

What Reporting Is Available

As mentioned previously, ADO reporting is primarily concerned with the progress of work items. However, the OData feed captures most of the ADO backend, so additional reporting on things such as pipelines and test results is also possible. Some typical examples include:

• Sprint progress reporting
• Resource burndown and capacity
• Work item cycle time
• Work item predictability and productivity
• Task completion forecasting
• Work item distribution
• CI/CD pipeline failures
• Application testing and release results

Virtually any reporting can be custom built using Power BI and the OData feed.
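As a taste of the arithmetic behind one of these reports, here is a minimal sprint-burndown sketch with made-up numbers: start from the total estimated hours and subtract the work completed each day.

```python
def burndown(total_hours: float, completed_per_day: list) -> list:
    """Remaining hours at the end of each sprint day."""
    remaining, series = total_hours, []
    for done in completed_per_day:
        remaining -= done
        series.append(remaining)
    return series

# A 40-hour sprint with sample daily completions (illustrative data).
print(burndown(40, [8, 6, 10, 8, 8]))  # → [32, 26, 16, 8, 0]
```

In the real report, the same series is computed from daily work-item snapshots in the feed and plotted against an ideal line.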

Developing Power BI reporting for ADO is also useful because of its scalability. The OData feed can be re-pointed to any ADO instance, so your reporting can be easily reproduced elsewhere. At FTS, we offer a pre-packaged report that can be implemented in any instance; it contains the most relevant ADO reporting based on our experience and has been very useful in managing our own projects.

Finally, Power BI reports can be embedded back into ADO via the native web-embed functionality. First create a dashboard in DevOps and add an iframe dashboard widget, then embed the Power BI report into the widget, allowing you to view your custom reporting within the DevOps browser. This ability to embed reporting elevates ADO into an all-in-one development management tool, greatly assisting project managers to keep track of resources and progress.
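For illustration, the dashboard widget ultimately renders an iframe around the embed URL you copy from Power BI. This sketch shows the shape of that markup; the URL below is a placeholder, not a real report:

```python
# Sketch of the iframe markup an ADO dashboard widget wraps around the
# embed URL copied from Power BI. The URL is a placeholder.

def iframe_for(embed_url: str, width: int = 800, height: int = 600) -> str:
    """Return minimal iframe markup for a Power BI embed URL."""
    return (f'<iframe width="{width}" height="{height}" '
            f'src="{embed_url}" frameborder="0" allowFullScreen="true"></iframe>')

snippet = iframe_for("https://app.powerbi.com/reportEmbed?reportId=PLACEHOLDER")
print(snippet)
```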

If you want to begin Azure DevOps Power BI reporting, please contact us for more information.

Power BI Dataflows: New and Improved


If you attended one of our Dashboard In A Day events earlier this year, you would have seen a brief demonstration of Power BI Dataflows, and what they can mean for an organisation. With the recent Microsoft update of Dataflows, now is a good time to familiarise yourself with this feature and learn how you can leverage it to improve the data culture in your organisation.

What Are Dataflows?

If you have worked with Power BI before, then you are familiar with Power Query, the tool used for extracting, transforming, and loading data into a data model. Power Query allows you to connect to a variety of data sources and perform detailed transformations to manipulate data into the format needed for analysis.

Dataflows extend this by allowing you to create these Power Query transformations once and make them available across your organisation for repeatable use. This is important for two reasons:

1. It scales data preparation, eliminating the need for users to perform the same transformations again and again.
2. It introduces a layer of governance by centralising and standardising data preparation assets.

Dataflows give users access to clean, transformed data that they can rely on and re-use. This is vital in supporting self-service analytics, as it provides users with reliable, pre-configured data assets.
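The idea can be sketched in plain Python as an analogy only (real dataflows are authored in Power Query): one shared, named preparation step that every report reuses instead of re-implementing the cleansing.

```python
# Analogy only: a dataflow is like one shared, named transformation that
# every downstream report reuses instead of re-implementing.

def clean_customers(rows: list) -> list:
    """Shared prep step: tidy names, lower-case emails, drop rows without one."""
    return [{"name": " ".join(r["name"].split()).title(),
             "email": r["email"].strip().lower()}
            for r in rows if r.get("email")]

raw = [{"name": " jane  smith ", "email": "Jane@Example.com"},
       {"name": "No Email", "email": ""}]

print(clean_customers(raw))  # → [{'name': 'Jane Smith', 'email': 'jane@example.com'}]
```

Every report connecting to the dataflow sees the same cleansed output, rather than each author repeating (and possibly diverging on) these rules.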

New Capabilities

Power BI has now introduced endorsement capabilities into the Dataflows feature. Dataset endorsement has been in use for some time and has proven very useful in establishing a quality data culture in an organisation. With this capability now extended to Dataflows, quality data assets can be more easily identified and shared across an organisation. Per the endorsement principles, dataflows can be marked for Promotion or Certification.

Promotion – tells users that the dataflow owner believes the dataflow is good enough to be shared and reused. Users need to have confidence in the owner to trust the quality of the dataflow.

Certification – tells users the dataflow has passed internal quality tests per organisational policy. Only specified users are authorised to mark dataflows as Certified.

Certified and Promoted dataflows are marked with badges when users connect to them in Power BI Desktop, so users can easily see which dataflows they should use when preparing reporting or analysis.

Why It's Important

Endorsement is an important step in making Dataflows an enterprise-ready feature. To use it, an organisation must adopt a policy for reviewing and certifying data preparation assets, which greatly improves data quality, as only certified dataflows are used for reporting and analytics outcomes.

Organisations that wish to promote a self-service environment also benefit greatly from endorsed dataflows, as they reduce the need for dedicated resources to create and control data access. Instead, quality data assets can be centrally managed in Power BI and made available for the organisation to connect to and use. Users can rely on the quality of the data and do not need to perform additional cleansing to get it ready for analysis.

If you want to know how Dataflows can be used in your organisation, please contact us for more information.