All Posts By James Beresford

Power BI Partner Showcase is published

Data Visualisation

We are excited to announce the publication of our first Power BI Partner Showcase! The Government Contract Analysis Tool is a demonstration of our ability to deliver powerful data visualisation solutions. It’s a great piece of work by our team and a strong recognition of their capability.

Power BI Partner Showcase - Government Contract Analysis Tool

Based on Open Data, this Power BI Partner Showcase demonstrates our ability to visualise data to provide insight into Federal Government spending patterns – check it out on the official Power BI site. It’s interesting to see how many contracts are awarded in June; this is visible in the bottom chart of the first page. In 2002, a whopping 83% of contracts were awarded in June. Surprisingly, not an election year.

The Government Contract Analysis Tool

So what purpose does it serve? Government contracting is a vast industry worth billions of dollars, with thousands of contracts awarded every year. Data at this scale is difficult to analyse usefully, and as a result commercial trends often go unnoticed.

Typical questions are:

  • Who are my major competitors for this service?
  • What procurement method should I tender with?
  • When is the best time to tender for work?
  • Where are the most contracts being awarded within Australia?
  • How long will I need to commit for a contract with this agency?
  • What growth has there been for contract values across agencies over time?
  • How much should I tender for?

To remain competitive and drive revenue growth, decision makers need accurate and real-time answers to the above questions. Unfortunately, direct and constructive analysis is difficult with the vast quantity and low quality of the data provided.

FTS Data & AI took the raw data and crafted an effective analysis tool that provides answers to the critical questions that decision makers demand.

Users can now:

  • Identify who their major competitors are, and how much they have earned in contract revenue, helping shape price and value propositions.
  • Determine what procurement method is preferred for each service and agency, reallocating resources and refining tender strategy to better suit the favoured method.
  • See when most contracts are awarded in a financial year, and how much the average value is for that contract, allowing ample time to plan executable strategy.
  • Locate where most contracts are being awarded in Australia, enabling users to lift and shift resources into areas with greater potential for work.
  • Track how long different agencies and services are demanding contracts span, giving users the ability to budget time and associated costs of work.
  • Recognise trends in growth of contract value for each agency and service, empowering decisions around service offerings and prices.
  • Monitor market prices for each agency and service, sliced across numerous fields, informing users of various price pressures and trends.

Using Power BI, FTS Data & AI has transformed a low-quality dataset into a fully-interactive reporting tool capable of providing clear and concise answers to the questions that decision makers are tasked with solving to ensure continued business success.


Getting Started with Chatbots

AI & ML

Most retail websites have a chat channel these days, and more often than not there isn’t a human being on the other end. A trained computer program – a chatbot – performs the mundane job of answering repetitive questions and never gets tired of doing it. In some cases, the chatbot completes in seconds tasks that would have been time-consuming for a human being. And this experience is only going to get richer for the user over time.

Learning to Walk before Running

Organisations that are looking to leverage chatbots to bring efficiencies into their customer-centric processes can gain valuable expertise by first building an inward-facing chatbot that assists their staff. By building a chatbot that employees can communicate with, the organisation can provide a valuable service to its staff and in the process, get a detailed understanding of the methodologies & tools required to build a productive chatbot. These learnings can then be applied to chatbots that are made available to customers.

Getting Started

The primary use case for building a bot is automating repetitive manual tasks. For a chatbot, a good use case is answering questions that a user would otherwise search for in a published document. Most organisations have an internal wiki or a corporate policy document that staff need to trawl through periodically to get answers to specific questions. A chatbot that simplifies this process can noticeably improve staff productivity.
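As a toy sketch of the core behaviour such a chatbot automates – question in, answer out of a published document – consider the following. The FAQ file, questions and answers are invented for illustration; a real bot would use a managed knowledge base rather than a keyword search:

```shell
# Minimal sketch of the chatbot's core job: look up the answer to a
# staff question in a published FAQ document. All content below is
# invented for illustration.
cat > /tmp/handbook_faq.txt <<'EOF'
How many days of annual leave do I get?|You accrue 20 days per year.
What is the dress code?|Smart casual, unless meeting clients.
How do I submit an expense claim?|Email the claim form to finance.
EOF

# ask: case-insensitive keyword match against the question column,
# returning the answer column of the first matching line.
ask() {
  grep -i -- "$1" /tmp/handbook_faq.txt | head -n 1 | cut -d'|' -f2
}

ask "annual leave"    # → You accrue 20 days per year.
```

A service-backed chatbot provides the same interaction with far better language understanding than a literal keyword match.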

The technical services and tools required to build a chatbot are now mature. Microsoft’s Azure Bot Service facilitates building & deploying a chatbot and integrating it with knowledge bases stored in cognitive services such as QnA Maker. Once the chatbot has been published, it can be integrated with chat channels such as Skype for Business & Teams.

The Next Steps

Once a chatbot that can answer questions from a knowledge base has been built, it can be made more intelligent by integrating it with cognitive services such as LUIS (Language Understanding Intelligent Service). This makes the chatbot responsive to the actual intents deciphered from the conversation. The models that power these cognitive services are constantly learning, making the chatbots more responsive over time.

Once an organisation successfully implements such an inward-facing chatbot, building a customer-facing chatbot becomes a natural extension. The organisation can then look to implement more complex process flows & integrations with internal systems such as CRMs to improve the overall user experience.

Our Experience

At FTS Data & AI, we practice what we preach. We’ve developed a chatbot named ‘fts-bot’ and integrated it with our Teams chat channel. The fts-bot can answer questions from FTS’s employee handbook, eliminating the need for staff to manually search a PDF document. Our staff, especially those who haven’t had much interaction with chatbots, have found the experience productive, and we continue to receive new ideas from technical & non-technical staff.


Chatbots will become ubiquitous on the internet in the future. They will offer customers a personalised user experience and continue to learn with each interaction. Food for thought – which time-consuming process do you currently follow that could be optimised by having a chatbot assist you? Please comment.

DevOps in Database Development

Data Platform

When we speak of applications development today, we assume DevOps is an integral part of the software development cycle. Modern microservices-based architectures facilitate the use of DevOps and the benefits of this are well known – agile development, quicker defect resolution, better collaboration, etc. Through containerisation using platforms such as Docker and container orchestrators such as Kubernetes and DC/OS, continuous integration and deployment become essential and not optional steps in daily activities. PaaS offerings in Microsoft Azure like AKS (Azure Kubernetes Service) make management of the platforms even simpler and thereby encourage uptake.

However, while DevOps practices have become mature in the applications development sphere, the same cannot be said when it comes to database development. To be able to build a true DataOps team that can integrate agile engineering processes encompassing IT and data teams, a DevOps mindset is essential. Many large enterprises as well as small organisations continue to follow age-old practices for developing data-related artefacts and as a result, we still see a lack of agility and at times, poor quality.

Microsoft has invested heavily to ensure that database developers can also leverage the benefits that have been reaped by application developers. Today’s SQL Server development IDE, SQL Server Data Tools (SSDT), comes loaded with features that enable a development team to collaborate and follow good programming practices. When combined with Visual Studio Team Services (VSTS), we get the environment needed to engender a DevOps-focused development culture.

Six Steps to DevOps

At Talos, we believe DevOps is a foundational step in ensuring high-quality outcomes for our clients as part of a Modern Data Platform. We therefore make use of the toolsets Microsoft provides in our development activities and adhere to strict policies, which are enforced by the tools. If you are looking to enable a similar culture in your database development team, consider the following guidelines:

  1. Version Control – Use a distributed version control system like Git for your database code. Git is ingrained in SSDT and VSTS, and for those who prefer the command line, Git can be used from a PowerShell window. Once you’ve set up a VSTS environment, make use of a SQL Server database project in SSDT for your database development and sync it with Git.
  2. Branching Strategy – Start with a simple branching strategy in Git. There is no one-size-fits-all approach, so pick a strategy based on the complexity of the project and the size of the team. As an example, in addition to the master branch, create a dev branch and have the development team work off this branch. Create pull requests to merge changes into the master branch. Ensure that the master branch is always stable.
  3. Development Environment – Consider making use of SQL Server 2017 hosted on Linux in Docker as a development instance. The containerised SQL Server instance is quick to boot, tear down & replace. PowerShell can be used to issue docker commands, or Kitematic can be used if the preference is for a GUI. 
  4. Continuous Integration – VSTS can be configured for automated builds which can be triggered when changes are committed. Configure continuous integration on the dev branch to ensure that the database builds successfully on every commit. 
  5. Continuous Deployment – Automate publishing changes to the QA environment. This allows testing to commence as soon as changes are committed successfully. Once the process matures, deployment to production can also be automated.
  6. Policies – Ensure access to the branches is only given to those who need it. Apply strict policies such as requiring a successful build as a prerequisite for a pull request to succeed. Automatically include code reviewers who would need to approve the changes before pull requests can be completed.
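Steps 1 and 2 above can be sketched from the command line. The repository path, file names and commit messages are illustrative placeholders, and the final merge stands in for a pull request that would normally be completed (with build and review policies) in VSTS:

```shell
# Sketch of steps 1-2: a repository with a stable master branch and a
# working dev branch. All names and files are illustrative placeholders.
set -e
rm -rf /tmp/db-devops-demo && mkdir /tmp/db-devops-demo
cd /tmp/db-devops-demo
git init -q
git config user.email "dev@example.com"
git config user.name "Dev"

# master holds the stable, buildable state of the database project
echo "-- SSDT database project placeholder" > Project.txt
git add . && git commit -qm "Initial database project"
git branch -M master

# day-to-day work happens on the dev branch...
git checkout -q -b dev
echo "CREATE TABLE dbo.Customer (Id INT NOT NULL);" > Customer.sql
git add . && git commit -qm "Add Customer table"

# ...and reaches master via a pull request (simulated here as a merge)
git checkout -q master
git merge -q --no-ff dev -m "PR: merge dev into master"
```

The `--no-ff` merge keeps an explicit merge commit on master, mirroring how a completed pull request appears in the history.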

These initial steps will ease the team into the DevOps culture. Look to get these steps right before moving to more advanced areas like automated unit testing, NuGet packaging, coupling database with application changes, etc.

Through a combination of mature tools and strict practices, a DevOps pipeline for database-related development activities is no longer a pipe dream. As MapR’s Chief Architect Ted Dunning has predicted, a sophisticated DataOps team comprising data-focused developers and data scientists will be the way of the future (MapR press release). Sound DevOps practices are the first step towards getting there.