Methodology of SQL Server Monitoring

Q1: Describe the strategy for monitoring SQL Server performance.

Before getting into the particulars of monitoring and optimization, it is very important to understand the methodology of SQL Server monitoring and optimization. Monitoring for the sake of monitoring is useless. You monitor your hardware and SQL Server implementations to anticipate and prevent performance problems. To do this, you must have some sort of plan: a strategy that will enable you to invest the right amount of time and the right amount of resources to maintain and improve the performance of your SQL Servers.

Optimization Strategy

The strategy for monitoring and optimizing SQL Server is fairly straightforward and is made up of the following steps:

  1. Create a performance baseline. Without a baseline of your database server, it is very unlikely that you will be able to make changes to the server platform with complete confidence that the changes will accomplish the improvements you are looking for. A baseline contains measurements from all the systems previously mentioned (system resources, SQL Server, the database, the database application, and the network). Specific counters and measurements are discussed later in this chapter. When evaluating the baseline, you may identify areas that warrant immediate optimization. If changes are made, a new baseline must be created.
  2. Complete periodic performance audits. After the baseline is completed, periodic performance audits are performed to ensure that performance has not degraded from when the baseline was created. This step is often supplemented or replaced by reactive audits that are performed in response to complaints of poor server performance. I prefer to be proactive and schedule the audits, but there will invariably be times when a reactive audit is required because unexpected performance problems arise.
  3. Make changes and evaluate their impact. After performing audits, you may find areas that require modification. When making these changes, it is important to be methodical. As a rule, you should not make multiple changes at once. Instead, make one or two changes, and then evaluate the measurements that prompted the changes. This makes it much easier to identify which changes have the greatest impact on performance.
  4. Reset the baseline. After completing all the changes, create another baseline to measure future performance trends.
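Part of such a baseline can be captured from within SQL Server itself. The following is a minimal sketch; the table name dbo.CounterBaseline and the choice of counters are assumptions, and a complete baseline would also include Performance Monitor data for the other systems listed above:

```sql
-- Snapshot a few SQL Server performance counters with a capture time
-- into a new baseline table for later trend comparison.
SELECT GETDATE() AS CaptureTime,
       RTRIM([object_name]) AS ObjectName,
       RTRIM(counter_name)  AS CounterName,
       cntr_value           AS Value
INTO dbo.CounterBaseline
FROM sys.dm_os_performance_counters
WHERE counter_name IN (N'Page life expectancy', N'Batch Requests/sec');
```

Subsequent audits can then INSERT into the same table and compare values against the original capture date.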

Q2: Why do we use the Log File Viewer for monitoring SQL Server?

Log File Viewer

The Log File Viewer is an excellent tool for viewing SQL Server and operating system logs in a single correlated view. For example, memory subsystem errors from the system log can be correlated with SQL Server errors, indicating out-of-memory conditions and allowing you to isolate the problem away from SQL Server. To open the Log File Viewer, expand the Management folder in SQL Server Management Studio, expand SQL Server Logs, right-click the log you want to view, and select View SQL Server Log. Once the Log File Viewer is open, you can choose to open additional SQL Server logs and/or operating system logs by expanding and selecting the logs you want to review (see Figure 10-2). Notice that you can also open log files for the SQL Server Agent and Database Mail.

Figure 10-2: Log File Viewer

SQL Server and SQL Server Agent log files are closed, and a new log opened, every time the respective service is restarted. In a production system, this may not happen very often, resulting in a large log file. To avoid intolerably large log files, the contents of the log files should be exported and the files cycled. To cycle the SQL Server log, execute the sp_cycle_errorlog stored procedure. To cycle the Agent log, the sp_cycle_agent_errorlog stored procedure is used. These procedures clear the contents of the logs without requiring a service restart.
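Cycling both logs can be scripted (and scheduled as a SQL Server Agent job); note that sp_cycle_agent_errorlog lives in msdb:

```sql
-- Close the current SQL Server error log and start a new one.
EXEC master.dbo.sp_cycle_errorlog;

-- Do the same for the SQL Server Agent error log.
EXEC msdb.dbo.sp_cycle_agent_errorlog;
```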

The number of logs that SQL Server keeps can be configured by right-clicking the SQL Server Logs folder and selecting Configure. The minimum and default number of logs is 6, but it can be increased to as many as 99. The number cannot be less than 6.
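The same setting can also be changed from T-SQL. The sketch below uses the undocumented xp_instance_regwrite extended procedure, so treat it as an assumption and prefer the Configure dialog where possible:

```sql
-- Set the number of retained error logs to 10 (valid range is 6-99).
-- xp_instance_regwrite resolves the path to this instance's registry hive.
EXEC master.dbo.xp_instance_regwrite
    N'HKEY_LOCAL_MACHINE',
    N'Software\Microsoft\MSSQLServer\MSSQLServer',
    N'NumErrorLogs',
    REG_DWORD,
    10;
```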

Q3: What types of modifications do we monitor in a database?

Monitoring Database Modifications

Like many people in the information technology field, I cut my teeth in desktop support, then moved on to network support, and finally settled in with SQL Server. I can't begin to count how many times I began a support conversation with, "Have you changed anything lately?" only to hear the scripted response, "No, I haven't done anything. It just stopped working." I bet you can relate. As a database administrator, your audience has changed a bit, but when a database application suddenly quits working, I can almost guarantee that you will hear the same answer from database and application developers: "I didn't do anything, it just stopped working."

A very powerful new feature in SQL Server 2005 gives the DBA the ability to refute that claim with solid audit evidence that, indeed, something was changed that broke the database. This new feature is the ability to monitor and even prevent database modifications through the use of Data Definition Language (DDL) triggers and event notifications.

Data Definition Language (DDL) Triggers

DDL triggers can be defined at the database and server scope. Like traditional Data Modification Language (DML) triggers, DDL triggers fire after the event that the trigger is defined on. If a trigger is defined to prevent the dropping of a database, the database is dropped first, and then put back when the trigger fires with a ROLLBACK statement in it. This can prove to be very costly, but may be less costly than having to restore the database from scratch. Unlike traditional triggers, DDL triggers are defined on a particular statement or group of statements, regardless of the object that the statement is directed at, and are not assigned to a particular object. As a result, a DROP_DATABASE trigger will fire no matter what database is dropped. In traditional DML triggers, a great deal of the functionality of the trigger is gained from access to the Deleted and Inserted tables that exist in memory for the duration of the trigger. DDL triggers do not use the Inserted and Deleted tables. Instead, if data from the event needs to be captured, the EVENTDATA function is used.

SQL Server Books Online contains a complete hierarchical listing of server- and database-level events that can be used with DDL triggers. You can find the list under the topic "Event Groups for Use with DDL Triggers."
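To illustrate the prevention scenario described above, here is a minimal sketch of a database-scoped DDL trigger that blocks table drops by rolling the statement back; the trigger name is an assumption:

```sql
-- Sketch: refuse DROP TABLE in the current database by rolling it back.
CREATE TRIGGER PreventTableDrop
ON DATABASE
FOR DROP_TABLE
AS
    PRINT 'DROP TABLE is not allowed in this database.';
    ROLLBACK;
GO
```

Remember that the drop happens first and is then undone by the ROLLBACK, which is why this technique can be expensive for large objects.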

EVENTDATA Function

The EVENTDATA function returns an XML document that contains predefined data about the event that caused the trigger to execute. The type of data largely depends on the event, but all triggers return the time the trigger was fired, the ID of the process that caused the trigger to execute, and the type of event.
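Inside a DDL trigger body, that XML document can be queried with the xml value() method. The sketch below pulls out the three fields common to all events; EVENTDATA returns NULL outside of a DDL trigger or event notification:

```sql
-- Sketch: extract the common fields from the event XML in a DDL trigger body.
DECLARE @event XML;
SET @event = EVENTDATA();
SELECT @event.value('(/EVENT_INSTANCE/EventType)[1]', 'nvarchar(100)') AS EventType,
       @event.value('(/EVENT_INSTANCE/PostTime)[1]',  'datetime')      AS PostTime,
       @event.value('(/EVENT_INSTANCE/SPID)[1]',      'int')           AS Spid;
```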

The following example creates a server-scoped DDL trigger that will execute any time a database is created, altered, or dropped:

    USE master;
    GO
    CREATE TRIGGER ServerDBEvent
    ON ALL SERVER
    FOR CREATE_DATABASE, ALTER_DATABASE, DROP_DATABASE
    AS
        PRINT CONVERT(nvarchar(max), EVENTDATA());  -- body assumed; original listing is truncated
    GO

Q4: Which Integration Services features do we use the most? Write all the steps for importing and exporting data with the wizard.

Integration Service

The Integration Service itself is actually managed through SQL Server Management Studio, not unlike many of your other server components. This component is used to handle the management and monitoring of both stored and running packages. Packages can be stored either in the file system, or they can be stored in the msdb database on a running instance of SQL Server 2005. Integration Services, when installed, assumes that the local default instance contains the msdb database that will be used for the package repository. However, because SQL Server 2005 and SQL Server 2000 can both be installed on the same machine, it is possible that your default instance is running a legacy version of SQL Server. If this is the case, you must manually edit the ServerName element in the MSDtsSrvr.ini.xml file. This file is in the 90\DTS\Binn directory of your SQL Server installation folder.

You can use SQL Server Management Studio to connect to an instance of SSIS, as shown in Figure 13-1. The following is a list of manageable features of the Integration Service in SQL Server Management Studio:

  • Connect to multiple Integration Services servers
  • Manage package storage
  • Customize storage folders
  • Import and export packages
  • Start local and remote stored packages
  • Stop local and remote running packages
  • Monitor local and remote running packages
  • View the Windows Event log

Integration Services Object Model

Integration Services includes a new object model comprising both native and managed application programming interfaces (APIs) for customizing the behavior of your Integration Services solutions. You can use these APIs to access SSIS tools, command-line utilities, or custom applications. You can also use the object model to execute SSIS tools and packages from within your own applications.

Integration Services Run-time

The Integration Services run-time engine is responsible for saving the control-flow logic and execution of SSIS packages. Integration Services run-time executables include packages, containers, predefined and custom tasks, and event handlers. The run-time handles execution order, logging, variables, and event handling. Programming the Integration Services run-time engine allows you to automate the creation, configuration, and execution of packages through the object model.

Introduction to SQL Server 2005 Integration Services

Integration Services Packages

Packages are units of execution that are composed of a series of other elements, including containers, tasks, and event handlers. You can create and manage packages through the BI Development Studio, or programmatically, using the Integration Services object model. Each package contains a control flow, which is a series of tasks (related or not) that will execute as a unit. Similar to jobs in the SQL Server Agent service (see Chapter 8), Integration Services packages use a customizable logic flow that controls the timed or forced execution of individual tasks.

Integration Services Tasks

Tasks are the basic unit of work within an Integration Services package. Each task defines an action that will be taken as part of the execution of the package. Some of the basic task types include Execute SQL tasks, in which a T-SQL script is executed; File System tasks, which interact with a local or remote file system; and Data Flow tasks, which control how data is copied between a source and destination. Many other types of tasks are discussed later in this chapter.

Integration Services Containers

Containers are objects that exist within the Integration Services environment to allow you to define one or more tasks as a unit of work. You can use containers to define parameters for the execution of these tasks. Four types of containers are available, and you learn more about them later in this chapter.

Integration Services Event Handlers

Event handlers are similar to packages, in that within them you can define tasks and containers. One major difference, though, is that event handlers are reactionary. This means that the tasks defined within an event handler will only be executed when a specific event occurs. These events are defined on tasks, containers, or the package itself, and include events that are fired before, during, and after the execution of the package.

Integration Services Data Flow

One of the most significant benefits of the SSIS design is the separation of the control flow from the data flow. Each package that contains a Data Flow task (such as an import or export) identifies the data flow task to the run-time engine, but then a separate data flow engine is invoked for that operation. The data flow engine manages what is typically the whole point of an SSIS package: extracting, transforming, and loading data. The data flow engine will extract data from data files or relational databases, handle any and all transforms that manipulate that data, and then supply the transformed data to the destination. A package may have more than one Data Flow task, and each task will execute its own data flow process for moving and manipulating data.

Importing and Exporting Data

One of the easiest ways to understand SSIS, and to see it in action, is through the Import/Export Wizard, which can be run from Management Studio. The process is essentially the same for both operations. The primary difference between the import operation and the export operation is whether your SQL Server is the source or the destination. It should be noted, however, that SSIS doesn't need to use a SQL Server as either the source or the destination! You can use SSIS to import data from a flat-file source (such as a comma-separated value file) into a Microsoft Access database.

In this example, you use a simple comma-separated value (CSV) file that contains a list of additional promotions the Adventure Works sales team will use for 2007. The contents of this file will then be imported into the Sales.SpecialOffer table. Start by creating a folder on the root of your C: drive called SSISDemos. Create a new text file in this folder, and enter the following data into the text file.

Description, DiscountPct, Type, Category, StartDate, EndDate

  1. Start or open SQL Server Management Studio.
  2. In the Object Explorer, select your server and expand Databases.
  3. Right-click AdventureWorks and select Tasks > Import Data. This will launch the SQL Server Import and Export Wizard. As with many other wizards in SQL Server Management Studio, you can elect not to see the introductory page of the wizard in the future.
  4. Click Next to move to the Data Source selection page. On this page, you can select the data source that will be used for the import process. Figure 13-3 shows the data source as you will configure it for this exercise. You should note that the options in the window will change to reflect the parameters of whichever data source you choose. Also note that a number of connection providers are already available out-of-the-box from SQL Server 2005, including the SQL Native Client, OLE DB providers for Analysis Services and Oracle, and Microsoft Office Excel and Access file formats.


Now, follow these steps:

  1. Select Flat File Source as your data source.
  2. In the "File name" box, you can enter the path to the file you created earlier (C:\SSISDemos\Promos.csv), or you can use the Browse button to find the file. Note that if you use the Browse button, it defaults to the .txt file extension, and you must select .csv from the drop-down list.
  3. Based on the contents of the file, the wizard should recognize the correct locale, code page, and format. You should select the "Column names in the first data row" checkbox, because you have included the header row in your file.
  4. Click Next on the Data Source selection page to move to the Columns page. This page, as shown in Figure 13-4, allows you to configure the row and column delimiters that are used in your flat file source. For example, if you used a pipe character (|) instead of a comma, you could enter that into the "Column delimiter" field.

Figure 13-4: Setting the flat file source options

This window also provides you with a columnized preview of the data source, so that you can verify the configuration of the data source provider. If the columns appear to be misaligned, or the data does not appear in the correct format, you may be using a different column or row delimiter, and will need to adjust your settings accordingly.

Before clicking Next, select the Advanced page. This page, shown in Figure 13-5, allows you to view or configure the properties of each column. This can be helpful when preparing data for the destination, and ensuring that it is in the correct format. You also have the ability to add or remove columns as needed, and can use the Suggest Types button to evaluate the data in the file and provide recommendations for the data types prior to importing the data.

Click Next without making any changes on the Advanced page. The next step in the wizard asks you to provide configuration information for the data destination. As mentioned earlier, you can use any of the available providers for both source and destination, and neither of them has to be a SQL Server. You could, in fact, use this import wizard to import the flat file into a Microsoft Excel spreadsheet, a Microsoft Access database, an Oracle database, or even another flat file. This functionality allows you to use SSIS to build complex packages that may have very distinct data migration paths before the execution can complete.

For this example, though, you're going to make it easy on yourself, and choose the SQL Native Client as your destination provider (this should be the default). When choosing the SQL Native Client, follow these steps:

  1. Choose your server name from the drop-down list if it is not already provided.
  2. Choose Windows Authentication.
  3. Ensure that AdventureWorks is selected as the target database.

The New button on this screen allows you to create a new database on this server for your data import. Figure 13-6 displays the configuration information you should use. The next page in the wizard allows you to define the specific views or tables that will be used as the destination for the data in your file. The default behavior of the wizard is to create a new table based on the name of the file, using the data types that were specified in the source configuration. You can use the drop-down list to select the correct table, as shown in Figure 13-7.

Clicking the Edit button under the Mapping column activates the Column Mappings window. This displays the column names in the data source, and allows you to match each to a column name in the destination. Fortunately, the file you created earlier happens to use the exact same column names as the destination table, so there is no guessing as to where the data will go. However, you can use this utility to specify that certain columns will be ignored, or simply mapped to a different target column.

You can also see in Figure 13-8 that there are options to delete or append rows in the destination table. In some cases, you may want to drop the table and then re-create it. This can be especially helpful if you want to completely purge the table, and there are no foreign-key constraints on it.

Once you have provided all the information about the source and destination, the wizard will ask if you want to execute the package immediately, and if you want to save your configuration as an SSIS package in either the msdb database or the file system (see Figure 13-9). For now, just choose to execute the package immediately, and don't worry about saving the package. You can either click Finish on this page to begin package execution, or you can click Next to view the summary information about the package before executing.

As long as all the steps have been followed as indicated, your data should now be imported successfully! You can execute a simple SELECT * FROM Sales.SpecialOffer query to view the imported data. If there was a problem with the execution of the package, use the error report information to pinpoint where the problem occurred, and what could be done to resolve it. Now that you've had a chance to see the import wizard at work, you should see the export wizard in action! It's actually pretty much more of the same. This time, however, you will export data using a query and save the results as a Microsoft Excel file. Follow these steps:

  1. Start by right-clicking the AdventureWorks database and selecting Tasks > Export Data.
  2. In the Choose a Data Source page, ensure that the SQL Native Client is specified as the Data Source, Windows Authentication is selected, and the database is AdventureWorks.
  3. Once you've configured or confirmed as necessary, click Next.
  4. In the Choose a Destination page, choose Microsoft Excel as the destination.
  5. Enter 'C:\SSISDemos\EmployeeData.xls' as the file name, and use the default Excel version.
  6. Ensure that "First row has column names" is selected and click Next.
  7. In the next window, you will be asked if you want to select data from existing tables and views, or if you want to specify a query to find the data to transfer. Choose the second option, "Write a query to specify the data to transfer," and click Next.
  8. In the Source Query window, enter the following query and click Next:

     USE AdventureWorks;
     GO
     SELECT PC.FirstName, PC.LastName, PC.EmailAddress,
            HRE.MaritalStatus, HRE.Gender, HRE.VacationHours,
            HRE.SickLeaveHours, HRE.SalariedFlag,
            HREP.Rate, HREP.ModifiedDate
     FROM Person.Contact AS PC
     INNER JOIN HumanResources.Employee AS HRE
         ON PC.ContactID = HRE.ContactID
     INNER JOIN HumanResources.EmployeePayHistory AS HREP
         ON HRE.EmployeeID = HREP.EmployeeID
     ORDER BY PC.LastName;

  9. In the Select Source Tables and Views window, you can change the column mapping options, or rename the destination worksheet for the new file. Feel free to explore the different options, but do not change the defaults.
  10. Once you're finished, click Next, and then click Finish on the following page to execute the package immediately.