One of the more powerful features for helping you comply with SOX, HIPAA, and other regulations is SQL Server Audit. However, reviewing the data the feature collects is not easy. In this final chapter, I want to show you how to view and filter your SQL Server Audit data using Power BI.

Business database teams are constantly working to comply with regulations such as SOX (Sarbanes-Oxley Act, 2002) and HIPAA (Health Insurance Portability and Accountability Act, 1996), and these teams tend to make auditing a core part of their strategies for tracking threats to their data. For example, a team running SQL Server might use Audit to log both database and server actions.

SQL Server Audit is built into the database engine and, from SQL Server 2016 SP1 onwards, is available in every edition of SQL Server.

If you are a DBA (database administrator) and your job is to implement Audit, you will find it is fairly easy to set up. The hard part is working out which user actions need to be audited, how large volumes of audit data should be handled, and which tools are best for monitoring and reviewing that data.

Important though those considerations are, for the purposes of this chapter we're going to focus on the final part – monitoring and reviewing the data. SQL Server gives you an easy way to collect audit data, but no meaningful way to work with it; all you can do is review it manually.

Perhaps the best option is Power BI. With it, you can quickly create reports that give you visual insight into your data. Power BI isn't designed for audits and alerts to the degree that some dedicated management tools are, but it does let you track user behavior relatively easily. And Power BI Desktop is free to use, as is the basic tier of the Power BI service.

Through a series of examples, I will show you how to use Power BI with SQL Server Audit data: how to set up your test environment, generate some audit data to work with, get that data into Power BI Desktop, and build a report containing both tables and visualizations.

Do bear in mind that Power BI Desktop and SQL Server Audit are both very powerful, and we cannot possibly cover everything they can do in this section. What I can do is offer a basic overview of how to use the tools together and how to get started reviewing audit data with Power BI.

Setting up your environment requires creating a test database called ImportSales, which contains a single schema called Sales and a table called Sales.Customers. We then populate that table with data from the Sales.Customers table in the WideWorldImporters database. For our purposes, the audited actions are restricted to the Customers table in the ImportSales database.

Run the following T-SQL code to create ImportSales:

USE master;
GO
DROP DATABASE IF EXISTS ImportSales;
GO
CREATE DATABASE ImportSales;
GO
USE ImportSales;
GO
CREATE SCHEMA Sales;
GO
CREATE TABLE Sales.Customers(
  CustID INT IDENTITY PRIMARY KEY,
  Customer NVARCHAR(100) NOT NULL,
  Contact NVARCHAR(50) NOT NULL,
  Email NVARCHAR(256) NULL,
  Phone NVARCHAR(20) NULL,
  Category NVARCHAR(50) NOT NULL);
GO
INSERT INTO Sales.Customers(Customer, Contact, Email, Phone, Category)
SELECT c.CustomerName, p.FullName, p.EmailAddress,
  p.PhoneNumber, cc.CustomerCategoryName
FROM WideWorldImporters.Sales.Customers c
  INNER JOIN WideWorldImporters.Application.People p
    ON c.PrimaryContactPersonID = p.PersonID
  INNER JOIN WideWorldImporters.Sales.CustomerCategories cc
    ON c.CustomerCategoryID = cc.CustomerCategoryID;
GO

If you do not have the WideWorldImporters database installed, use your own data to populate the Customers table. If you would rather use a different database and table to follow these examples, skip the T-SQL statement above and use whatever suits your needs, so long as it is not in a production environment. Just make sure you replace any references to ImportSales or Customers in the rest of the examples.
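For instance, as a minimal sketch, you could seed the table by hand – the rows below are entirely made up and purely illustrative:

-- Hypothetical sample rows for when WideWorldImporters is unavailable
INSERT INTO Sales.Customers(Customer, Contact, Email, Phone, Category)
VALUES
  ('Tailspin Toys (Head Office)', 'Waldemar Fisar',
   'waldemar@example.com', '(308) 555-0100', 'Novelty Shop'),
  ('Wingtip Toys (Head Office)', 'Lorena Cindric',
   'lorena@example.com', '(817) 555-0100', 'Gift Store');
GO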

Next, you need an audit object created at the SQL Server instance level, along with a database audit specification created at the ImportSales database level. The audit object is essentially a container: it organizes the server and database audit settings and handles the delivery of the final logs. We are going to save our audit data to a local folder, although SQL Server Audit also lets you write it to the Windows Application or Security log.

The database audit specification is created at the database level. It must be associated with an audit object, and that object must exist before the specification can be created. The specification determines which actions are audited at the database level. A similar specification can be created for auditing at the server level, but we will use only the database specification in this section.

Both the object and the specification can be created using the T-SQL code below:

USE master;
GO
CREATE SERVER AUDIT ImportSalesAudit
TO FILE (FILEPATH = 'C:\DataFiles\audit\');
GO
ALTER SERVER AUDIT ImportSalesAudit
WITH (STATE = ON);
GO
USE ImportSales;
GO
CREATE DATABASE AUDIT SPECIFICATION ImportSalesDbSpec
FOR SERVER AUDIT ImportSalesAudit
ADD (SCHEMA_OBJECT_CHANGE_GROUP),
ADD (SELECT, INSERT, UPDATE, DELETE
  ON Object::Sales.Customers BY public)
WITH (STATE = ON);
GO

The CREATE SERVER AUDIT statement creates the object, called ImportSalesAudit, which saves the audit data to the C:\DataFiles\audit folder. The ALTER SERVER AUDIT statement is then run to set the STATE property to ON.
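If you want to confirm the audit is running and check where its files land, one quick way (a sketch using the names defined above) is to query the server-level catalog and dynamic management views:

-- Verify that ImportSalesAudit is enabled and locate its files
SELECT a.name, s.status_desc, s.audit_file_path
FROM sys.server_audits a
  INNER JOIN sys.dm_server_audit_status s
    ON a.audit_id = s.audit_id
WHERE a.name = 'ImportSalesAudit';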

Next, the CREATE DATABASE AUDIT SPECIFICATION statement defines the ImportSalesDbSpec specification, which has two ADD clauses. The first specifies the SCHEMA_OBJECT_CHANGE_GROUP action group, which audits all the CREATE, ALTER, and DROP statements issued against any of the database schema objects. As with all action groups, it has to be specified separately from the individual actions, like those in the other ADD clause.

There are four such actions in this ADD clause: SELECT, INSERT, UPDATE, and DELETE.

In the second ADD clause, the ON subclause points to the Customers table, so those four actions are audited specifically for that table. And because the BY subclause specifies the public role, the auditing applies to every user.

Normally, you would audit many more users and actions, but what we've done here is sufficient to show the basics of reviewing audit data with Power BI.
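Before moving on, you can double-check what the specification covers by querying the database-level catalog views – again just a sketch, assuming the objects created above:

USE ImportSales;
GO
-- List the audited actions defined by ImportSalesDbSpec
SELECT s.name [Specification], d.audit_action_name, d.audited_result
FROM sys.database_audit_specifications s
  INNER JOIN sys.database_audit_specification_details d
    ON s.database_specification_id = d.database_specification_id;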

Once the audit structure is built, run the T-SQL code below; it creates three test user accounts in the ImportSales database and assigns each user its own set of permissions:

CREATE USER User01 WITHOUT LOGIN;
GRANT ALTER, SELECT, INSERT, DELETE, UPDATE
ON OBJECT::Sales.Customers TO User01;
GO
CREATE USER User02 WITHOUT LOGIN;
GRANT SELECT, INSERT, DELETE, UPDATE
ON OBJECT::Sales.Customers TO User02;
GO
CREATE USER User03 WITHOUT LOGIN;
GRANT SELECT
ON OBJECT::Sales.Customers TO User03;
GO

To keep things simple, the user accounts are created without logins. All the permissions granted are specific to the Customers table, with access defined like this:

  1. User01: ALTER, SELECT, INSERT, UPDATE, and DELETE
  2. User02: SELECT, INSERT, UPDATE, and DELETE
  3. User03: SELECT only

You can set up the test users and their permissions however you like; just make sure they are in place when you generate the audit data for use in Power BI Desktop.

Generating the audit data requires running several DML (data manipulation language) and DDL (data definition language) statements against the Customers table. These must be run in the execution context of each user account, and the easiest way to do that is to switch context with an EXECUTE AS USER statement, run the statements, and then issue REVERT, as the examples below show.

The DDL and DML statements you run are entirely up to you, so long as each account and its permissions get exercised. I ran several T-SQL statements, many of them multiple times, so that I had a decent amount of data. I started with these DML statements, run under User01:

EXECUTE AS USER = 'User01';
SELECT * FROM Sales.Customers;
INSERT INTO Sales.Customers
  (Customer, Contact, Email, Phone, Category)
  VALUES('Wingtip Toys (Eugene, OR)', 'Flora Olofsson',
    '[email protected]', '(787) 555-0100', 'Gift Store');
DECLARE @LastID INT = (SELECT SCOPE_IDENTITY());
UPDATE Sales.Customers SET Category = 'Novelty Shop'
WHERE CustID = @LastID;
DELETE Sales.Customers WHERE CustID = @LastID;
REVERT;
GO

Do run the statements as many times as needed to generate the amount of audit data you want. I ran my DML statements around five times and the DDL statements a couple of times.
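If you are working in SSMS or sqlcmd, one shortcut worth knowing (not used in the original examples) is that GO accepts a count, which repeats the preceding batch that many times:

-- GO <n> repeats the preceding batch n times (SSMS and sqlcmd only)
EXECUTE AS USER = 'User01';
SELECT * FROM Sales.Customers;
REVERT;
GO 5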

Once you have run those statements, run the next batch as User01 – it adds a new column to the Customers table:

EXECUTE AS USER = 'User01';
ALTER TABLE Sales.Customers
ADD Status BIT NOT NULL DEFAULT(1);
REVERT;
GO

Now, as User02, run the DML statements again, several times:

EXECUTE AS USER = 'User02';
SELECT * FROM Sales.Customers;
INSERT INTO Sales.Customers
  (Customer, Contact, Email, Phone, Category)
  VALUES('Tailspin Toys (Bainbridge Island, WA)', 'Kanti Kotadia',
    '[email protected]', '(303) 555-0100', 'Gift Store');
DECLARE @LastID INT = (SELECT SCOPE_IDENTITY());
UPDATE Sales.Customers SET Category = 'Novelty Shop'
WHERE CustID = @LastID;
DELETE Sales.Customers WHERE CustID = @LastID;
REVERT;
GO

As User02, attempt to add a column to Customers using the T-SQL statement below:

EXECUTE AS USER = 'User02';
ALTER TABLE Sales.Customers
ADD LastUpdated DATETIME NOT NULL DEFAULT(GETDATE());
REVERT;
GO

You should get an error: User02 does not have permission to modify the table's definition. However, be aware that when you run several statements as one user and one of them fails, the REVERT statement doesn't run; you need to run it again, without the other statements, to make sure the execution context is closed. A better approach is to write logic into the code that ensures REVERT always runs.
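As a minimal sketch of that logic (my own variation, not part of the original examples), a TRY...CATCH block can absorb the permission error so the batch always reaches REVERT:

EXECUTE AS USER = 'User02';
BEGIN TRY
  -- Fails because User02 lacks ALTER permission on the table
  ALTER TABLE Sales.Customers
  ADD LastUpdated DATETIME NOT NULL DEFAULT(GETDATE());
END TRY
BEGIN CATCH
  -- Report the error instead of aborting the batch
  PRINT ERROR_MESSAGE();
END CATCH;
REVERT;  -- always runs, closing the execution context
GO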

Now, as User03, run the DML statements again:

EXECUTE AS USER = 'User03';
SELECT * FROM Sales.Customers;
INSERT INTO Sales.Customers
  (Customer, Contact, Email, Phone, Category)
  VALUES('Tailspin Toys (Bainbridge Island, WA)', 'Kanti Kotadia',
    '[email protected]', '(303) 555-0100', 'Gift Store');
DECLARE @LastID INT = (SELECT SCOPE_IDENTITY());
UPDATE Sales.Customers SET Category = 'Novelty Shop'
WHERE CustID = @LastID;
DELETE Sales.Customers WHERE CustID = @LastID;
REVERT;
GO

This time, three of the statements – INSERT, UPDATE, and DELETE – generate errors, because User03 does not have the necessary permissions. As before, the REVERT statement will need to be run on its own. The same applies to the ALTER TABLE statement:

EXECUTE AS USER = 'User03';
ALTER TABLE Sales.Customers
ADD LastUpdated DATETIME NOT NULL DEFAULT(GETDATE());
REVERT;
GO

Again, run the DDL and DML statements that you want, ensuring you have sufficient data for Power BI Desktop.

When you connect Power BI Desktop to SQL Server, you can pull the data from specified tables and views, or you can run a query that returns exactly the data you need from several tables and views. Queries also let you use system functions such as sys.fn_get_audit_file, a table-valued function that returns the results from the SQL Server Audit log files.

We will use that function in the SELECT statement below, which returns five columns:

SELECT f.database_principal_name [User Acct],
  (CASE
    WHEN a.name = 'STATEMENT ROLLBACK' THEN 'ROLLBACK'
    ELSE a.name
  END) [User Action],
  (CASE
    WHEN f.succeeded = 1 THEN 'Succeeded'
    ELSE 'Failed'
  END) [Succeeded],
  f.statement [SQL Statement],
  f.event_time [Date/Time]
FROM sys.fn_get_audit_file
  ('C:\DataFiles\audit\ImportSalesAudit_*.sqlaudit',
   default, default) f
  INNER JOIN (SELECT DISTINCT action_id, name
  FROM sys.dm_audit_actions) a
    ON f.action_id = a.action_id
WHERE f.database_principal_name IN ('User01', 'User02', 'User03');

The sys.fn_get_audit_file function is joined to the sys.dm_audit_actions dynamic management view so that the results show each action's full name instead of its abbreviation. The WHERE clause also limits the results to our three user accounts.

This is the statement to use when you set up the connection between Power BI Desktop and SQL Server. To configure the connection, create a new report in Power BI Desktop, go to the Home ribbon, click Get Data, and then click SQL Server. When the dialog box opens, expand it by clicking the arrow beside Advanced Options.

Configure the connection like this:

  1. In the Server box, type the server instance – .\sqlsrv17a in this example
  2. In the Database box, type the database name – ImportSales
  3. In the Data Connectivity section, choose an option – I went for DirectQuery
  4. In the SQL Statement box, enter the SELECT statement shown above

When DirectQuery is chosen, no data is copied or imported into Power BI Desktop. Instead, Power BI Desktop queries the underlying data source whenever you create or interact with a visualization, ensuring that you always see the most up-to-date data.

However, you should be aware that if you want to publish your report to the Power BI service, a gateway connection must be created so the service can reach the source data. Also be aware that a Power BI Desktop report cannot contain both an Import SQL Server connection and a DirectQuery SQL Server connection – it must use only one of them.

Once the connection has been configured, click OK and a preview window opens, showing a subset of the audit data. If it all looks as it should, click Load and the data is made available to Power BI Desktop, ready for adding tables and visualizations. If you chose DirectQuery, only the schema is loaded into Power BI Desktop, not the source data.

As I said earlier, you do not have to save your audit data to log files. However, if you opt for the Security or Application log, you will have to do more work to extract the data and convert it to an easier format, such as a .csv file.

You could, for example, export the data from the Log File Viewer in SSMS (SQL Server Management Studio) or from Windows Event Viewer, but the exported format won't be easy to work with, and you might find you have far more data than you need. You could also use PowerShell to get the log data; you would need to create a script and schedule it to run automatically, but you would have more control over the output, although it still takes a little extra work to get it right.

However you make your data available to Power BI Desktop, you must consider data security. Using the Security log might seem the safer approach at first, but once the data has been exported to files, you face the same problems as when the data is written straight to log files. The data has to be fully protected at all times, both in motion and at rest; there would be no point in an audit strategy that fully complies with regulations if the data is put at risk during the audit itself.

Once the data is available to Power BI Desktop, you can go ahead with your reports, presenting the data in ways that provide different types of insight.

One good way of showing data is to present some of it in table form. The data can be filtered however you need for your audit strategy, and you can add slicers to filter it in specific ways – for example, by user, by action, or by success and failure.

When a filter is applied, Power BI updates the data in the table based on the values you chose. This gives you an easy way to reach different categories of data without reviewing it all manually or writing a new T-SQL statement every time you want different information.

Another good way to get quick insights is to summarize the data in a matrix. You could, for example, add a matrix showing how many actions each user took and what those actions were. On the matrix page, you will spot a list of user actions on the right side; one of these is ROLLBACK, which appears whenever a main statement fails, such as when a user lacks permission for a certain action.

Slicers can also be added to the matrix report page, letting you filter the data as needed. The matrix is particularly useful because it can be set up to drill down into your data – for example, into how many actions succeeded or failed. When you set this up, you decide the order in which the layers are presented, depending on the type of data you are using and its hierarchical nature.

Power BI Desktop supports lots of different visualizations, and you can also import others. Make sure you use only the visualizations that explain your data clearly. Say, for example, that your report page shows three visualization types, each giving a different perspective on your data. You can filter out the ROLLBACK actions so that the visualizations reflect only the user-initiated statements rather than the system-generated responses.

In that example layout, the clustered bar chart on the left shows the data grouped by user, providing a total number of actions per user. For the DML actions, all three user accounts show the same total – though that will depend on how many times the statements were run.

At the top right, another clustered bar chart shows the data grouped by user again, but this time broken down into successes and failures for each group. This way, you can see which users are trying to run statements they don't have permission for.

At the bottom right is a donut chart. It shows the same data as the clustered bar chart above it, but from a different perspective. Hovering over any one of its elements shows you that action's share of the total.

What all this is showing you is that you can use the same data in different visualizations to see what works and what doesn’t.

When elements are placed on the same report page, Power BI ties them together automatically, so selecting an element in one visualization is reflected in the others.

Lastly, you can also add gauges, cards, KPIs (key performance indicators), and other elements to your reports. You could, for example, add a gauge that shows how many ROLLBACK statements were executed in response to failed attempts at running T-SQL statements.

By adding a slicer, you can also see, for example, whether each user is hitting a specified threshold. When you select a user in the slicer, you need to specify a target value for the gauge. This matters because the target value can be used to set alerts in the Power BI service – alerts cannot be configured in Power BI Desktop, and you will need a Power BI Pro license to use these features. Alerts can be set on gauges, KPI visuals, and cards, and they can also generate regular email notifications.

When you use Power BI to visualize your SQL Server Audit data, you are using one of the more powerful tools available for reviewing information quickly and efficiently. I really have only touched the tip of the iceberg here, but you should now have a decent understanding of how to use the two together for auditing purposes. How you choose to use them is up to you, but you must remember to protect the integrity of the audit data, be it at rest or in motion, wherever it comes from.
