Commercial Work

The following is a short list of some of the work I have completed to date.

All reports developed are dynamic with parameter options.

Terms within parentheses () highlight the skills used on each project.

A list of college modules completed can be viewed here (LINK).

Key Terms (Azure Data Factory, Fact, Dimension, DBA, Data Warehouse, Data Model, Batch Scripting, Excel, VBA, T-SQL, SSMS, SSRS, SSIS, SSAS, Domain Knowledge, Business Logic, PowerPivot, PowerShell, Fraud Detection, Stored Procedures, UX, UI, Graphing, SharePoint, DAX, Access, SQL Server Agent Jobs, ETL, Reporting, Validation, Automation, User Defined Function, Visual Studio, Version Control, Database Development, Team Foundation Server, Maintenance Plans, HTML, CSS, Database Mail, Risk, Health, Tableau, JavaScript, Ola Hallengren, TLog, VLF, Backups, Fragmentation, Index, Service Pack, Security, GDPR, Compliance, Microsoft, Planning, Migration, Monitoring, Maintenance, Architecture, Active Directory, Emergencies)

Projects:

Company: One of the Big Four accountancy firms

 

P1. (Batch Scripting)

The development and introduction of a file merging/appending batch script, which reduced the analysis and file-cleaning workload.

P2. (EXCEL, VBA)

The optimisation and automation of the data cleaning process in Excel, utilising numerous functions created with VBA.

P3. (EXCEL, VBA)

The development and introduction of a record linkage function in Excel for file comparison, utilising VBA.

P4. (T-SQL, SSMS)

A SQL Server 2008 database developed to store and deduplicate securities data, e.g. common stock.

P5. (T-SQL, Domain Knowledge)

Accountancy formulas translated into T-SQL to automate the auditing process within a database.

P6. (EXCEL, VBA, Business Logic, Domain Knowledge, Fraud Detection)

Reverse engineered computer-aided audit tool (CAAT) Excel add-ins, copying the UI and functions, which enabled the following:

  • Data stratification.
  • Sample extractions.
  • Missing sequence identification.
  • Statistical analysis.

P7. (EXCEL, VBA, Domain Knowledge, Business Logic)

Excel function developed to automate the creation of accounts by allowing an audit senior to tag entries in a company trial balance. The summation of these accounts would then automatically create the profit and loss account. This function saved a significant amount of time while reducing labour requirements and improving quality and accuracy.

Company: Consulting work for one of the largest blue chip software companies in the world

P1. (Excel, T-SQL, Stored Procedures, DAX, PowerPivot, SSAS)

Excel report created showing a quarter on quarter comparison of the top 25 business customers. This report could be sliced geographically, by area and by regional operation centre. This report was requested by the treasurer of the company and was part of the financial deck for the company directors.

P2. (Excel, T-SQL, Dataset, Business Logic, PowerPivot, Fraud Detection)

Email validation report which categorises customer email addresses and flags those categorised as potential risks.

Categorisation criteria (sketched in T-SQL below):

  • Email addresses that contain the company name or a subsidiary name while conforming to a valid email address format were excluded.
  • Anything with a gmail, yahoo or hotmail address was flagged as private email.
  • A check was performed to test if the company name (or the initials of a company name) were in the email address.
  • Duplicate domains were assessed.
  • Known email address exceptions, i.e. where the company name is not included in the email address, were accounted for.

The flagged emails would then be assessed manually.

A report was prepared with the flagged emails where progress could be tracked as employees investigated the addresses and updated the report.
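A minimal T-SQL sketch of how this kind of categorisation could be expressed; the table and column names are hypothetical, not the originals:

    -- Hypothetical staging table: CustomerEmail(CustomerId, CompanyName, EmailAddress)
    SELECT
        ce.CustomerId,
        ce.EmailAddress,
        CASE
            -- valid format and contains the company name: excluded from review
            WHEN ce.EmailAddress LIKE '%_@_%._%'
                 AND ce.EmailAddress LIKE '%' + REPLACE(ce.CompanyName, ' ', '') + '%'
                THEN 'Excluded'
            -- free webmail domains flagged as private email
            WHEN ce.EmailAddress LIKE '%@gmail.%'
                 OR ce.EmailAddress LIKE '%@yahoo.%'
                 OR ce.EmailAddress LIKE '%@hotmail.%'
                THEN 'Private email'
            ELSE 'Review'
        END AS EmailCategory
    FROM dbo.CustomerEmail AS ce
    -- known exceptions (companies whose addresses never contain the company name) are excluded
    WHERE NOT EXISTS (SELECT 1 FROM dbo.EmailException AS ex WHERE ex.CustomerId = ce.CustomerId);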

P3. (SSRS, T-SQL)

An SSRS report flagging non-perfect-match clearances in accounts receivable.

P4. (T-SQL, EXCEL)

An ad hoc Excel report showing the number of invoices per month sent to volume licensing partners. This was part of an analysis to determine how much money could be saved by consolidating the number of invoices sent globally.

P5. (SSRS, T-SQL, PowerPivot)

A report called the Forward Aged History Log was created, allowing the user to step back to the first day of a previous month and forward age the accounts receivable records from that date.
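A simplified illustration of forward aging against a user-chosen as-of date (table and column names are hypothetical):

    DECLARE @AsOfDate date = '2014-01-01';  -- first day of a previous month, chosen by the user

    SELECT
        ar.CustomerId,
        SUM(CASE WHEN DATEDIFF(DAY, ar.DueDate, @AsOfDate) <= 0              THEN ar.OpenAmount ELSE 0 END) AS NotYetDue,
        SUM(CASE WHEN DATEDIFF(DAY, ar.DueDate, @AsOfDate) BETWEEN 1  AND 30 THEN ar.OpenAmount ELSE 0 END) AS Age1To30,
        SUM(CASE WHEN DATEDIFF(DAY, ar.DueDate, @AsOfDate) BETWEEN 31 AND 60 THEN ar.OpenAmount ELSE 0 END) AS Age31To60,
        SUM(CASE WHEN DATEDIFF(DAY, ar.DueDate, @AsOfDate) > 60              THEN ar.OpenAmount ELSE 0 END) AS AgeOver60
    FROM dbo.AccountsReceivableHistory AS ar
    WHERE ar.SnapshotDate = @AsOfDate        -- records as they stood on the chosen date
    GROUP BY ar.CustomerId;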

P6. (EXCEL, VBA)

Due to a quirk in a pre-existing SharePoint application, the invoice write-off data it produced required reworking before it could be loaded into another application. The solution was to provide the end user with a VBA macro that could establish which rows contained cells with entire records written into a single cell. The macro could then split the contents of those cells out into distinct rows.

P7. (SSRS, Stored Procedures)

An SSRS report which highlighted uncleared items based on the clearing date field status. Parameters included period, area and regional operation centre.

P8. (EXCEL, T-SQL, Stored Procedures, SharePoint, UX, UI, Graphing, PowerPivot, DAX, SSAS)

An Excel report showing contacts logged with debtors on a daily, weekly and monthly basis.

Assesses:

  • Number of customers with open AR
  • Number of customers contacted
  • Number of manual comments logged
  • Percentage of customers contacted

Graphs the trend of:

  • Number of contacts made per month over the previous year

Graphically displays (by way of a bar chart) the number of each of the following contact types:

  • Account Reconciliation Action
  • Email
  • Formal Letter Sent
  • Internal Call/Email
  • Internal Notification Sent
  • Outbound Call
  • Visit to Customer

P10. (T-SQL, EXCEL, DAX, SharePoint, SSIS, SSAS, UX, UI, Business Logic, Domain Knowledge, PowerPivot, MDX, Data Warehouse)

The primary SLA (Service-level agreement) was the collection of accounts receivable. This required a frequently updated report demonstrating current and historic collection progress.

This was a very significant report, aggregating millions of data points, with high visibility across the client company, serving dozens of managers and hundreds of employees globally. It was the main deliverable of a consolidation project to automate, standardise and globalise reports that had previously existed separately in each region. A data pipeline was necessary to ETL the required data, as millions of rows were pulled from several locations globally. This data was transformed and stored in a Kimball data warehouse, as it needed to be aggregated into fact tables and retained for year-on-year and quarter-on-quarter comparison. The population process needed to run three times a day to stay in sync with the three time zones the collection services operated across.

The SLA report had over 70 fields. Due to its global scope it was necessary to keep it as generic as possible, allowing end users to further refine the report by department, area, country and regional operation centre with the use of 10 slicers.
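As an illustration of the warehouse side, a minimal sketch of the kind of fact-table population involved; the table and column names here are hypothetical and the real schema was far wider:

    -- Hypothetical Kimball-style fact table
    CREATE TABLE dbo.FactCollection (
        DateKey         int            NOT NULL,   -- FK to DimDate
        CustomerKey     int            NOT NULL,   -- FK to DimCustomer
        RegionKey       int            NOT NULL,   -- FK to DimRegion
        OpenARAmount    decimal(18, 2) NOT NULL,
        CollectedAmount decimal(18, 2) NOT NULL
    );

    -- Run on a schedule (in the real project, three times a day, once per operating time zone)
    INSERT INTO dbo.FactCollection (DateKey, CustomerKey, RegionKey, OpenARAmount, CollectedAmount)
    SELECT
        CONVERT(int, CONVERT(char(8), s.SnapshotDate, 112)) AS DateKey,   -- YYYYMMDD surrogate
        dc.CustomerKey,
        dr.RegionKey,
        SUM(s.OpenAmount),
        SUM(s.CollectedAmount)
    FROM staging.ARSnapshot AS s
    JOIN dbo.DimCustomer AS dc ON dc.CustomerSourceId = s.CustomerId
    JOIN dbo.DimRegion   AS dr ON dr.RegionCode       = s.RegionCode
    GROUP BY CONVERT(int, CONVERT(char(8), s.SnapshotDate, 112)), dc.CustomerKey, dr.RegionKey;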

P11. (Access, SQL, EXCEL)

A department-wide time management database developed in SQL Server and used to record employee absence, whether through illness, planned holiday, training time, etc. This database was developed as a prerequisite to a capacity report.

Company: Electricity supplier

P1. (SSIS, SQL Server Agent Jobs, ETL)

Reconciling the electricity supply board summary messages against the live individual messages received.

Process involved:

  • Creating tables to house data
  • Creating an SSIS job to reshape daily coded messages
  • De-duplicating data
  • Removing duplicates
  • Highlighting missing files

Deliverable: an updated and accurate account of customer status history by way of reconciled data.

P2. (Stored Procedures, EXCEL, VBA, Reporting, UX, UI)

Updated and automated a funnel reporting summary covering communication with customers, specifically:

  • Contact rates
  • Win rates once contacted
  • Registration cancellations with old suppliers

The report showed a graphical breakdown of the methods of contact, i.e. door to door, retention team and landlord accounts, over time.

Report formatting was introduced, transforming the Excel file from a bare-bones spreadsheet into a pseudo custom front end with the company's branded colours and logo.

Automation was achieved by creating a stored procedure on the database providing the data. No external connections were created as part of the file as the connection and update are achieved using VBA. This adds an extra layer of security as the VBA access is locked.

P3. (SSIS, SQL Server Agent Jobs)

Creation of an ODS (operational data store) using SSIS packages. This was the preliminary step in creating a common data warehouse shared between two separate branches of the business.

P4. (SQL Server Agent Jobs, Stored Procedures, ETL, EXCEL, Validation)

Third-party vendors supply data in the form of Excel files regarding installations, which needs to be incorporated into the reporting database for cross-referencing as well as validation and reconciliation.

Before this could be achieved, the Excel file templates needed to be updated to include validation rules.

Once this was completed, a job was created to pull the file from a predetermined destination into the database, where logic checks and validations are performed before the data is loaded.

P5. (Stored Procedures, Reporting, SSRS)

Call and web lead reports created. Two regularly run SQL queries were turned into two stored procedures which are used by two SSRS Reports to track marketing leads from websites and direct calls from customers. These communications tie back to specific marketing campaigns.

P6. (Stored Procedures, SQL logic)

A stored procedure was modified to update the logic which checks whether electricity supply board message files already exist within the database. If a file does not exist it is loaded; otherwise it is rejected.
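In outline, the updated check resembled the following (object names are hypothetical):

    DECLARE @FileName varchar(260) = 'EDI_20150101.txt';  -- in the real procedure this is a parameter

    IF NOT EXISTS (
        SELECT 1
        FROM dbo.MessageFileLog AS mfl
        WHERE mfl.FileName = @FileName
    )
    BEGIN
        EXEC dbo.usp_LoadMessageFile @FileName = @FileName;   -- load the new file
        INSERT INTO dbo.MessageFileLog (FileName, LoadedOn) VALUES (@FileName, GETDATE());
    END
    ELSE
    BEGIN
        INSERT INTO dbo.RejectedFileLog (FileName, RejectedOn, Reason)
        VALUES (@FileName, GETDATE(), 'Duplicate file');       -- reject the duplicate
    END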

P7. (Stored Procedures, SSRS, Business Logic)

The creation of a stored procedure and report which track the success or failure of the retention team in winning back or losing a customer to a competitor. The report tracks customer details such as the customer name, whether they were won or lost, and the competitor they were lost to.

P8. (SSRS, UX, UI)

Created the standard company wide report template to be used in all SSRS reports.

P10. (Stored Procedures, Reporting, SSRS)

Created a stored procedure and report to highlight new joins, with the intention of sending them a welcome email.

P11. (SSMS, Automation)

Used the SQL Server 2008 R2 Reporting Services Configuration Manager to set up a ReportServer database for use in sending SSRS reports via email automatically.

P12.(Stored Procedures, Reporting, SSRS)

Created a stored procedure and report to highlight customers who have received promotions, what these promotions were for and a summation of their monetary value.

P13.(Stored Procedures, Reporting, SSRS)

Updated a legacy report which shows the success rate of agents and contact methods with new metrics.

P14. (Stored Procedures, Reporting, SSRS)

Updated regularly run queries that generate call and sales data for a daily report and converted these queries into Stored Procedures for use in two SSRS Reports.

P15. (T-SQL, User Defined Function)

Created a function to convert UTC to local GMT, taking account of daylight saving time.
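On SQL Server 2016 or later the conversion can be sketched with AT TIME ZONE, which handles the daylight saving switchover automatically; the original function, written for an earlier version, calculated the clock-change boundaries by hand:

    CREATE FUNCTION dbo.fn_UtcToLocal (@UtcDateTime datetime2)
    RETURNS datetime2
    AS
    BEGIN
        -- 'GMT Standard Time' applies UTC+0 in winter and UTC+1 during daylight saving
        RETURN CAST(@UtcDateTime AT TIME ZONE 'UTC' AT TIME ZONE 'GMT Standard Time' AS datetime2);
    END;

Using a named Windows time zone avoids hard-coding the last-Sunday-in-March/October rules.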

P16. (T-SQL, User Defined Function)

Converted a stored procedure into a function assigning calls to the correct queue based on call type and menu press.
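Roughly the shape of the conversion, as a scalar function with hypothetical call types and menu options:

    CREATE FUNCTION dbo.fn_GetCallQueue (@CallType varchar(20), @MenuPress tinyint)
    RETURNS varchar(30)
    AS
    BEGIN
        RETURN CASE
            WHEN @CallType = 'Sales'                      THEN 'SalesQueue'
            WHEN @CallType = 'Support' AND @MenuPress = 1 THEN 'BillingQueue'
            WHEN @CallType = 'Support' AND @MenuPress = 2 THEN 'TechnicalQueue'
            ELSE 'GeneralQueue'
        END;
    END;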

 P17. (T-SQL, Database Development, Stored Procedures, Business Logic)

Created the database, tables and stored procedures to capture live call data from the in2Tel system.

This relies on the in2Tel call management system sending call posts, which need to be intercepted and stored in the database. Each call has 5 posts; when the fifth has been received, the call is complete.

This process uses a web service created in C# to intercept the posts and then call the Stored Procedures to populate the table.
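A rough sketch of the capture procedure called by the web service, assuming hypothetical table and column names:

    CREATE PROCEDURE dbo.usp_InsertCallPost
        @CallId     varchar(50),
        @PostNumber tinyint,        -- 1 to 5, sent by the in2Tel system
        @Payload    nvarchar(max)
    AS
    BEGIN
        SET NOCOUNT ON;

        INSERT INTO dbo.CallPost (CallId, PostNumber, Payload, ReceivedOn)
        VALUES (@CallId, @PostNumber, @Payload, GETDATE());

        -- the fifth post means the call has finished; mark it complete
        IF @PostNumber = 5
            UPDATE dbo.[Call]
            SET IsComplete = 1, CompletedOn = GETDATE()
            WHERE CallId = @CallId;
    END;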

P18. (Stored Procedures, Reporting, SSRS, UX, UI)

Leader board display for sales agents.

An SSRS report set to refresh every 5 minutes to be displayed on a large monitor visible to everyone in the sales department. The leader board displays the sales agents' names, attributed number of live calls and sales, as well as conversion rates.

When a user does not enter a date range, by default the display shows today's figures only. When a user enters a date range, a period column is included in the report showing sales for every day in the range, allowing the report to work historically.

P19. (SSRS, Deployment, Version Control, Database Development, Team Foundation Server: Visual Studio)

Created and updated existing tables and scripts to support the release of a mobile app which ties into the company's back end. The focus was on the communication aspect of the app, email etc.

P20. (Stored Procedures, SSRS, Reporting, Business Logic)

Created call-transfer SQL logic and incorporated it into existing reports.

P21. (T-SQL, Reporting, Business Logic)

Identification of incorrect call transfers based on call type arrival and final call type destination.

P22. (T-SQL, Reporting, Business Logic, Automation, Fraud Detection)

High-visibility project focused on creating a workflow engine to be used for revenue protection by detecting fraud. The main deliverables were:

  • Creating a pipeline to reduce large volumes of data to a condensed data set.
  • Defining the business rules of the workflow engine to detect fraud.
  • Creating the workflow engine itself.
  • Automating actions such as emails, SMS messages etc. to be triggered based on the analysis of the workflow engine.

P23. (T-SQL, Batch Scripting)

Created a batch script to:

  • Combine dozens of SQL insert scripts containing thousands of rows.
  • Split the combined script into insert batches.

 P24. (T-SQL, EXCEL Formulas)

Created a customer load profile table and populated it with data provided in Excel files. This required the use of Excel formulas to create dynamic insert statements.

 P25. (T-SQL, Stored Procedures, Optimisation)

Optimised a poorly written legacy stored procedure; execution time dropped from 30 minutes to 10 seconds.

 P26. (SSRS, Reporting, Stored Procedure, Business Logic)

Conversion of an Excel report used by the CEO regarding customer acquisitions into an SSRS report.

P27. (Stored Procedure, CRM)

Created a stored procedure that allows users to push promotions through the CRM where promotions have failed due to users providing incorrect data, e.g. mobile number or email address.

P28. (Stored Procedure)

Created the stored procedures needed to return the following fields regarding customer top-up data (a condensed sketch follows the lists):

  • Date Created
  • Top-up type e.g. (web, promotion, android/apple)
  • Number of top-ups
  • Top up amount

All of the above grouped by day, along with:

  • Date of new registrants
  • Number of new registrants
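A condensed sketch of the kind of aggregation involved, with hypothetical table and column names:

    -- Daily top-up summary
    SELECT
        CAST(t.CreatedOn AS date) AS DateCreated,
        t.TopUpType,                               -- e.g. web, promotion, android/apple
        COUNT(*)                  AS NumberOfTopUps,
        SUM(t.Amount)             AS TopUpAmount
    FROM dbo.TopUp AS t
    GROUP BY CAST(t.CreatedOn AS date), t.TopUpType
    ORDER BY DateCreated, t.TopUpType;

    -- Daily new registrants
    SELECT
        CAST(r.RegisteredOn AS date) AS RegistrationDate,
        COUNT(*)                     AS NewRegistrants
    FROM dbo.Registrant AS r
    GROUP BY CAST(r.RegisteredOn AS date)
    ORDER BY RegistrationDate;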

P29. (T-SQL, Database Development, Stored Procedures, Business Logic, UX, UI, Optimisation, Data Warehouse)

Migrated the daily sales report for the CEO and directors from an Excel file to an SSRS report, upgrading it into a more encompassing and dynamic report covering all the relevant company KPIs. The suite of reports showed department data on a monthly basis, which could be drilled down to week and day.

The upgrade required a much more extensive back end. An Inmon data warehouse was created with entities such as Call, Sale, Agent and Customer. The data warehouse pulled from an ODS I had created in a prior project.

P30. (T-SQL, Database Development, Stored Procedures, Business Logic)

New stored procedure to allow front-end users to update a customer's email and phone number.

P31. (T-SQL, Database Development, Stored Procedures, Business Logic)

New stored procedures to create aggregated views of customer top-up history and new registrants.

P32. (T-SQL, Database Development, Stored Procedures, Business Logic, UX, UI, VBA, EXCEL, PowerPivot)

New dynamic, self-service report for the CEO showing new customers, customer losses and net gain, and graphing progress for the year. The report creates a connection from Excel to the database via VBA and executes stored procedures.

P33. (T-SQL, SSRS, SSMS, Stored Procedures, Business Logic, UX, UI)

Created report “Completion Rate Report Specification”. This report is designed to show, for specified campaign groupings, the current outcome of sales that occurred within a specified time frame.

Provides an agent level completion rate with dynamic conversion metrics.

P34. (T-SQL, SSRS, SSMS, Stored Procedures, Business Logic, UX, UI)

Created report “Reschedule Rate Report Specification”. This report was designed to show the current status of every account that was left at the status of ‘In Progress’ for a given period of time.

P35. (T-SQL, SSRS, SSMS, Stored Procedures, Business Logic, UX, UI)

Created report “Cancellation Rate Report Specification”. This report is designed to show, for specified campaign groupings, the cancellation reasons that occurred within a specified time frame.

Company: Insurance Broker

P1. (T-SQL, SSMS, Stored Procedures, Schema, Tables, Business Logic, Data Model, Data Warehouse)

The development of an Inmon Data Warehouse, which included all the necessary stored procedures, tables and logic to classify entities such as Policies, Portfolios, Calls and Sales. This allowed analysis reports on Calls, Sales etc. to be generated.

P2. (T-SQL, Stored Procedures, Optimisation, Business Logic)

The optimisation of the logic used to classify a policy as being in arrears.

P3. (Archiving, T-SQL, Stored Procedures, Optimisation, Business Logic)

Archiving process introduced to optimise the customer renewals invite logic.

P4. (DBA, Archiving, T-SQL, Stored Procedures, Database Mail, HTML, CSS)

Implemented database monitoring by creating the Administration Assessment Database (AADB). This database can observe and alert on the following (one such collector is sketched after the list):

  • User Activity
  • Database Growth
  • Drive Space
  • Read Write Stats
  • Table Stats
  • Wait Stats
  • TempDb Activity
  • TempDb Resource Utilisation
  • DDL Schema Changes

Ongoing project.
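For example, a database growth snapshot can be captured along these lines (the AADB table name here is illustrative, not the actual schema):

    -- Snapshot current data/log file sizes; run on a schedule and compare over time
    INSERT INTO AADB.dbo.DatabaseGrowthHistory (CaptureDate, DatabaseName, FileName, SizeMB)
    SELECT
        GETDATE(),
        DB_NAME(mf.database_id),
        mf.name,
        mf.size / 128.0            -- size is stored in 8 KB pages; 128 pages = 1 MB
    FROM sys.master_files AS mf;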

P5. (DBA, Optimisation, SSMS, Maintenance plan, Backups)

Reorganised the backup maintenance plans to optimise them and reduce competition for resources. Removed unnecessary backup files, saving terabytes of space.

P6. (DBA, Optimisation, Risk, Health)

Working with Microsoft engineers, I brought two production servers through the Microsoft Risk Assessment Program (RAP), which covers risk, health and optimisation issues.

P7. (DBA, Optimisation)

Optimised a query that generates sales information for product and pricing.

P8. (Tableau, HTML, CSS, Java Script)

Created custom web page template that auto refreshes Tableau views.

P9. (T-SQL, Stored Procedures, Optimisation, Business Logic)

Introduced a new process so that new campaigns would no longer need to be added to existing business logic manually.

P10. (T-SQL, Stored Procedures, Optimisation, Business Logic)

Created a dataset that allows the company to determine the categorisation of customers they have on their books.

P11. (T-SQL, Batch Scripting, SQL Server, DBA)

Compensated for a malfunctioning and obsolete Tableau installation. This process requires a scheduled server task running a batch script that restarts the Tableau server and writes an output file to the C drive when completed. This is necessary as Tableau generates objects that are not being dropped in the associated SQL Server tempdb. Restarting the Tableau services closes the connection with SQL Server, allowing the objects to be dropped.

P12. (T-SQL, SQL Server, DBA)

Created a robust, foolproof script that auto-generates database restore scripts.
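In outline, the script reads the most recent full backup for each database from msdb and prints a RESTORE statement; a simplified sketch:

    SELECT
        'RESTORE DATABASE [' + bs.database_name + '] FROM DISK = N''' + bmf.physical_device_name
            + ''' WITH NORECOVERY, REPLACE;' AS RestoreCommand
    FROM msdb.dbo.backupset AS bs
    JOIN msdb.dbo.backupmediafamily AS bmf ON bmf.media_set_id = bs.media_set_id
    WHERE bs.type = 'D'                                   -- full backups only
      AND bs.backup_finish_date = (SELECT MAX(b2.backup_finish_date)
                                   FROM msdb.dbo.backupset AS b2
                                   WHERE b2.database_name = bs.database_name AND b2.type = 'D');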

P13. (DBA, Optimisation)

Optimised the backup schedule over the network by staggering the SQL Server production servers' backups. This reduced the time taken to complete the backup process.

P14. (DBA, SQL Server, SSMS, T-SQL)

Introduced SQL Server system alerts which email staff when triggered.
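These follow the standard SQL Server Agent pattern of pairing an alert with an operator notification; a sketch for one severity level (the operator name and address are placeholders, and Database Mail must already be configured):

    EXEC msdb.dbo.sp_add_operator
        @name = N'DBA Team',
        @email_address = N'dba@company.example';

    -- Alert on severity 17 (insufficient resources); repeated for higher severities
    EXEC msdb.dbo.sp_add_alert
        @name = N'Severity 17 - Insufficient Resources',
        @severity = 17,
        @include_event_description_in = 1;   -- 1 = include the event description in the email

    EXEC msdb.dbo.sp_add_notification
        @alert_name = N'Severity 17 - Insufficient Resources',
        @operator_name = N'DBA Team',
        @notification_method = 1;            -- 1 = email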

P15. (DBA, SQL Server, SSMS, T-SQL, Security)

Introduced SQL Server Security Login Audit for failed and successful logins.

P16. (DBA, SQL Server, SSMS, T-SQL)

Created a stored procedure to retrieve SQL Server Agent job history.
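Roughly, the procedure wraps a query over the msdb Agent history tables; a simplified sketch:

    CREATE PROCEDURE dbo.usp_GetJobHistory
        @JobName sysname = NULL          -- NULL returns history for all jobs
    AS
    BEGIN
        SET NOCOUNT ON;

        SELECT
            j.name                                          AS JobName,
            msdb.dbo.agent_datetime(h.run_date, h.run_time) AS RunDateTime,  -- msdb helper combining the integer date/time columns
            h.step_id,
            h.step_name,
            CASE h.run_status WHEN 0 THEN 'Failed' WHEN 1 THEN 'Succeeded'
                              WHEN 2 THEN 'Retry'  WHEN 3 THEN 'Cancelled'
                              ELSE 'In Progress' END        AS Outcome,
            h.run_duration                                  AS RunDurationHHMMSS,
            h.message
        FROM msdb.dbo.sysjobs AS j
        JOIN msdb.dbo.sysjobhistory AS h ON h.job_id = j.job_id
        WHERE @JobName IS NULL OR j.name = @JobName
        ORDER BY RunDateTime DESC;
    END;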

P17. (DBA, SQL Server, SSMS, T-SQL)

Created a script to help identify obsolete tables, providing information on table size, indexes, last modified date and last interaction.
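A condensed version of the idea, combining catalogue metadata with the index usage DMV (usage statistics reset on instance restart, so results need interpretation):

    SELECT
        s.name + '.' + t.name                                             AS TableName,
        t.modify_date                                                     AS LastSchemaChange,
        SUM(CASE WHEN ps.index_id IN (0, 1) THEN ps.row_count ELSE 0 END) AS TableRows,
        SUM(ps.reserved_page_count) * 8 / 1024.0                          AS ReservedMB,
        MAX(u.last_user_seek)                                             AS LastSeek,
        MAX(u.last_user_scan)                                             AS LastScan,
        MAX(u.last_user_update)                                           AS LastUpdate
    FROM sys.tables AS t
    JOIN sys.schemas AS s ON s.schema_id = t.schema_id
    JOIN sys.dm_db_partition_stats AS ps ON ps.object_id = t.object_id
    LEFT JOIN sys.dm_db_index_usage_stats AS u
           ON u.object_id = ps.object_id AND u.index_id = ps.index_id AND u.database_id = DB_ID()
    GROUP BY s.name, t.name, t.modify_date
    ORDER BY LastScan, LastSeek;   -- never-touched tables (NULLs) sort to the top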

P18. (DBA, SQL Server, SSMS, T-SQL, Excel)

Created an Excel file called “T-SQL Assessor” to aid in the assessment of long and confusing T-SQL.

P19. (DBA, SQL Server, SSMS, T-SQL, Ola Hallengren, Fragmentation, Index)

Introduced the Ola Hallengren maintenance and backup scripts across the database environment.

P20. (DBA, SQL Server, T-SQL, TLog, VLF)

Created a script to balance out transaction log (TLog) sizes and reduce virtual log file (VLF) counts.

P21. (DBA, SQL Server, T-SQL)

Introduced a permanent hands-off process to automatically clean history tables.

P22. (DBA, SQL Server, T-SQL, Fragmentation)

Created a query to test index fragmentation across the instance.
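The usual approach is via the physical stats DMV; a minimal sketch for the current database (the full script iterated over all databases):

    SELECT
        OBJECT_SCHEMA_NAME(ips.object_id) + '.' + OBJECT_NAME(ips.object_id) AS TableName,
        i.name                                                               AS IndexName,
        ips.avg_fragmentation_in_percent,
        ips.page_count
    FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
    JOIN sys.indexes AS i ON i.object_id = ips.object_id AND i.index_id = ips.index_id
    WHERE ips.avg_fragmentation_in_percent > 30   -- candidates for a rebuild
      AND ips.page_count > 1000                   -- ignore tiny indexes
    ORDER BY ips.avg_fragmentation_in_percent DESC;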

P23. (DBA, SQL Server, T-SQL, Service Pack)

Implemented SQL Server Service Pack upgrades across the environment.

P24. (DBA, SQL Server, T-SQL, Domain Knowledge)

Set up the SQL Server back end for a newly created insurance company, including instance setup, web mail, performance observations, alerts, backups and maintenance, and users and logins.

P25. (GDPR, Compliance, DBA, SQL Server, T-SQL, Domain Knowledge)

I am a key player in developing and implementing data retention policies and logic, and in reporting on GDPR compliance. (Ongoing)

  • Created the Compliance Databases and logic to capture stats on compliance
  • Assisted in the upgrading of application back ends to be GDPR compliant
  • Interrogated systems and companies as to their GDPR compliance
  • Attended GDPR meetings and advised on solutions and compliance levels
  • Dropped objects across the environments deemed obsolete or an unnecessary data risk

P26. (Microsoft, Planning, Migration, Monitoring, Maintenance, Tableau, DBA, SQL Server, T-SQL, Domain Knowledge)

Data Warehouse Migration

  • Worked with Microsoft engineers to create a plan for migration
  • Assisting with migration tasks
    • Backups and restores
    • Job migration
    • Troubleshooting
    • Server checks
    • Monitoring and Maintenance setup
  • Proposed Tableau upgrade be undertaken at the same time and assisted in upgrade

P27. (Planning, Architecture, DBA, SQL Server, T-SQL, Domain Knowledge)

Proposals

  • Proposed architectural redesigns
  • Proposed ODS using replication
  • Advised on DR site and DR recovery process
  • Proposed measures to stay within licence restrictions regarding SQL Server RAM
  • Provided support during call recording file location repointing

P28. (Security, Active Directory, DBA, SQL Server, T-SQL, Domain Knowledge)

Security

  • Regular advising on security concerns
  • Identified security problems relating to system access not conforming to best practices
  • Removed generic accounts and implemented AD security group account access
  • Assisted in the setup and transition of services to use AD service accounts

P29. (Emergencies, DBA, SQL Server, T-SQL, Domain Knowledge)

Emergency Actions

  • Prevented 3 potential SQL Server shutdown events and space grabs
  • Recovered 1 live environment crash and another corrupted live environment
  • Balanced disk space limitations to prevent outages
  • Found issues with VM backups

Company: International Hotelier

Created processes for adding and capturing data from new Resorts in the Data warehouse.

Created Database templates for common objects, e.g. stored procedures, dimension tables, fact tables etc.

Standardized pipelines that consume dimension/fact data.

Created a customer Email History slowly changing dimension.
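A minimal sketch of the Type 2 pattern involved, with hypothetical table and column names: expire the current row when the email changes, then insert the new version:

    -- Expire the current row when the incoming email differs
    UPDATE dim
    SET dim.ValidTo = GETDATE(), dim.IsCurrent = 0
    FROM dbo.DimCustomerEmail AS dim
    JOIN staging.CustomerEmail AS src ON src.CustomerId = dim.CustomerId
    WHERE dim.IsCurrent = 1
      AND dim.EmailAddress <> src.EmailAddress;

    -- Insert a new current row for changed or brand new customers
    INSERT INTO dbo.DimCustomerEmail (CustomerId, EmailAddress, ValidFrom, ValidTo, IsCurrent)
    SELECT src.CustomerId, src.EmailAddress, GETDATE(), NULL, 1
    FROM staging.CustomerEmail AS src
    WHERE NOT EXISTS (
        SELECT 1 FROM dbo.DimCustomerEmail AS dim
        WHERE dim.CustomerId = src.CustomerId
          AND dim.IsCurrent = 1
          AND dim.EmailAddress = src.EmailAddress
    );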

Captured Credit Card rebates from PMS to be used in financial reconciliation processes.

Acted as a consultant during “single customer view” software acquisition meetings.

Created report highlighting valid phone/email statistics.

Created custom logic to dedupe customer data, creating a single customer view.
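At its core this kind of deduplication ranks candidate records per matching key and keeps the best one; a simplified sketch (the real matching rules were more involved):

    ;WITH Ranked AS (
        SELECT
            c.*,
            ROW_NUMBER() OVER (
                PARTITION BY LOWER(LTRIM(RTRIM(c.Email)))   -- hypothetical matching key
                ORDER BY c.LastStayDate DESC, c.CreatedOn DESC
            ) AS rn
        FROM dbo.CustomerRaw AS c
    )
    SELECT *
    INTO dbo.SingleCustomerView
    FROM Ranked
    WHERE rn = 1;    -- keep one record per matching key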

Introduced DBA scripts to disable jobs.

Set up distribution email lists based on job roles rather than employee names.

Set up file exports from Information Systems.

Generated ad hoc reports regarding reservations, customer contact details.

Set up Azure Data Factory pipelines to consume Property Management System dimension/fact data.

Set up Azure Data Factory pipelines to consume HR Information System (PeopleHR) dimension/fact data.

Set up Azure Data Factory pipelines to consume POS Information System (SumUp) dimension/fact data.

Set up Azure access for new joins.

Wrote PowerShell scripts to upload data from SFTP to Azure Cloud Storage (Reducing the operational costs of data flow).

Acted as administrator for SSRS environment.

Updated Azure Data Factory pipelines to include additional columns from marketing system data (STR).

Provided ad hoc report for National Crime Association.

Acted as a consultant in hiring process, reviewing CVs, preparing interview questions and engaging in interviews.

Created RPA engineer assessment test based on CIMA exam.
