SQL Server will, by default, write failed login attempts to the error log. Failed login attempts are worth reviewing not only for security reasons but also because they are sometimes a symptom of another problem.
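
For context, a failed login shows up in the error log as an error 18456 entry along these lines (the date and client IP below are made up for illustration); the parsing in section 4 of the script relies on this message format:

  2024-01-15 10:23:45.67 Logon       Error: 18456, Severity: 14, State: 8.
  2024-01-15 10:23:45.67 Logon       Login failed for user 'sa'. Reason: Password did not match that for the login provided. [CLIENT: 10.0.0.5]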

One of the problems with log analysis is the overwhelming volume of information. It's time-consuming to sit and read through the log entry by entry. Luckily, we have a plethora of tools to assist.

This post provides a solution for parsing and aggregating failed login attempts from the error log using T-SQL.

Here's the script for querying and parsing the database error log.

/* //////////////////////////////////////////// 1: Variable Table Declarations //////////////////////////////////////////// */

 
 -- Holds the output of xp_dirtree: one row per file in the error log directory
 DECLARE @ErrorFiles AS TABLE
    (
      subdirectory NVARCHAR(4000) NOT NULL ,
      depth BIGINT NOT NULL ,
      [file] BIGINT NOT NULL
    );
 
 -- Matches the result set returned by sp_readerrorlog
 DECLARE @ErrorLogSummary AS TABLE
    (
      LogDate DATETIME NOT NULL ,
      ProcessInfo NVARCHAR(100) NOT NULL ,
      [Text] NVARCHAR(4000) NOT NULL
    );


/* //////////////////////////////////////////// 2: Get number of error logs /////////////////////////////////////////////// */
 


 -- SERVERPROPERTY('ErrorLogFileName') returns the full path of the current ERRORLOG file;
 -- strip the file name to get the log directory
 DECLARE @ErrorLog NVARCHAR(4000), @ErrorLogPath NVARCHAR(4000);
 SET @ErrorLog = CAST(SERVERPROPERTY(N'errorlogfilename') AS NVARCHAR(4000));
 SET @ErrorLogPath = SUBSTRING(@ErrorLog, 1, LEN(@ErrorLog) - CHARINDEX(N'\', REVERSE(@ErrorLog))) + N'\';
 
 -- List the directory contents; the third argument (1) tells xp_dirtree to include files
 INSERT INTO @ErrorFiles
 EXEC xp_dirtree @ErrorLogPath, 0, 1;
 
 -- Count the ERRORLOG files, then convert the count to the highest log number
 -- (logs are numbered 0 = current through N - 1 = oldest archive)
 DECLARE @NumberOfLogfiles TINYINT;
 SET @NumberOfLogfiles = (SELECT COUNT(*) FROM @ErrorFiles WHERE subdirectory LIKE N'ERRORLOG%');

 IF @NumberOfLogfiles <> 0
    SET @NumberOfLogfiles = @NumberOfLogfiles - 1;


 /* //////////////////////////////////////////// 3: Extract Log data ///////////////////////////////////////////////////// */
 
  
DECLARE @x TINYINT = 0;

-- Loop through all logs looking for entries containing both 'Login' and 'failed'
WHILE @x <= @NumberOfLogfiles  -- <= so the oldest archive log is included
BEGIN
    -- Second parameter: 1 = SQL Server error log (2 would be the SQL Agent log);
    -- both search strings must appear in an entry for it to be returned
    INSERT INTO @ErrorLogSummary
    EXEC sys.sp_readerrorlog @x, 1, 'Login', 'failed';

    SET @x = @x + 1;
END


 /* //////////////////////////////////////////// 4: Parse Log   ///////////////////////////////////////////////////// */
 DECLARE @SearchString VARCHAR(22);

 SET @SearchString = 'User';

 -- Extract user name and client IP from the log records.
 -- CHARINDEX and LIKE are case-insensitive under the default collation,
 -- so 'User' also matches 'user' and '[client:' also matches '[CLIENT:'.
 ;WITH Login_CTE
 AS ( SELECT CASE WHEN [Text] LIKE '%[[]client:%'
                  -- '[client:' is 8 characters; + 9 also skips the space after the colon
                  THEN SUBSTRING([Text], CHARINDEX('[client:', [Text]) + 9,
                                 CHARINDEX(']', [Text], CHARINDEX('[client:', [Text]))
                                 - CHARINDEX('[client:', [Text]) - 9)
                  ELSE 'Client Not Parsed'
             END AS Client ,
             CASE WHEN [Text] LIKE 'Login failed for user%'
                       AND [Text] LIKE '%Reason%'
                  -- Pull the quoted name between "user '" and "'. Reason"
                  THEN SUBSTRING([Text],
                                 CHARINDEX(@SearchString, [Text]) + LEN(@SearchString) + 2,
                                 CHARINDEX('Reason', [Text])
                                 - CHARINDEX('user', [Text]) - 9)
                  WHEN [Text] LIKE '%untrusted domain%'
                  THEN 'untrusted domain'
                  WHEN [Text] LIKE '%Only administrators%'
                  THEN 'admin downtime'
                  ELSE 'User Name Not Parsed'
             END AS UserName ,
             LogDate
      FROM @ErrorLogSummary
    )



 /* //////////////////////////////////////////// 5: Aggregate Results   ///////////////////////////////////////////////////// */
 

 --Aggregate
 SELECT UserName, Client, COUNT(1) AS FailedAttempts,
        MAX(LogDate) AS LastFailedAttempt, MIN(LogDate) AS FirstFailedAttempt
 FROM Login_CTE
 GROUP BY UserName, Client
 ORDER BY (CASE WHEN UserName = 'sa' THEN 2147483647 ELSE COUNT(1) END) DESC;  -- Float sa to the top, then sort by attempt count

This can be run ad hoc or as part of a SQL Agent job. Create two jobs: one to store the results in an archive table and one to delete rows after a specified retention period, e.g. all data older than six months. A rough sketch of both pieces follows.
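
As a minimal sketch, assuming a made-up archive table named dbo.FailedLoginArchive (rename to suit your conventions), the two pieces could look like this:

 -- One-time setup: the archive table (hypothetical name and columns)
 CREATE TABLE dbo.FailedLoginArchive
    (
      UserName NVARCHAR(4000) NOT NULL ,
      Client NVARCHAR(4000) NOT NULL ,
      FailedAttempts INT NOT NULL ,
      LastFailedAttempt DATETIME NOT NULL ,
      FirstFailedAttempt DATETIME NOT NULL ,
      CollectionDate DATETIME NOT NULL DEFAULT GETDATE()
    );

 -- Job 1: run the script above, with the final SELECT in section 5 changed to
 -- INSERT INTO dbo.FailedLoginArchive
 --     (UserName, Client, FailedAttempts, LastFailedAttempt, FirstFailedAttempt)
 -- SELECT ... (same query as section 5)

 -- Job 2: purge anything older than the retention period (six months here)
 DELETE FROM dbo.FailedLoginArchive
 WHERE CollectionDate < DATEADD(MONTH, -6, GETDATE());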

I prefer this solution, but others exist. For sensitive or high-risk servers, you may want to consider real-time alerting. These tools can help address that need (a Server Audit example follows the list):

  • Extended Events
  • Server Audit
  • Custom PowerShell script
  • Third-party tools
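
For example, a minimal Server Audit that captures failed logins to a file target could look like the following (the audit names and file path are placeholders; treat this as a sketch, not a hardened setup):

 USE master;
 GO
 -- Audit destination: a binary audit file (placeholder path)
 CREATE SERVER AUDIT FailedLoginAudit
 TO FILE ( FILEPATH = N'C:\Audits\' );
 GO
 -- Capture the built-in failed-login action group
 CREATE SERVER AUDIT SPECIFICATION FailedLoginAuditSpec
 FOR SERVER AUDIT FailedLoginAudit
 ADD ( FAILED_LOGIN_GROUP )
 WITH ( STATE = ON );
 GO
 -- Turn the audit on
 ALTER SERVER AUDIT FailedLoginAudit WITH ( STATE = ON );
 GO
 -- Read back what was captured
 SELECT event_time, server_principal_name, statement
 FROM sys.fn_get_audit_file(N'C:\Audits\*.sqlaudit', DEFAULT, DEFAULT);

From there, alerting can be driven off the audit file on whatever schedule or mechanism suits the environment.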