
Tuning IIS web server for best performance


Recommended settings for best performance

Assumptions: a load of 250 to 300 concurrent users on a machine with 4 logical processors.

Enable the ASP feature in IIS - this enables the further settings described below, such as the threads-per-processor limit.

Threads Per Processor Limit - this setting defaults to 25; depending on the load, you may extend it to 100.

Queue Length property - the value is 3000 by default, but we have set it to 400 because this property is supposed to be 4 × Threads Per Processor Limit, which works out to 400 here.
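
As a sketch, on IIS 7 and later both ASP limits can be set from the command line through the server-level ASP configuration section, using the values chosen above (on IIS 6 the equivalent metabase properties are AspProcessorThreadMax and AspRequestQueueMax):

%windir%\system32\inetsrv\appcmd set config /section:asp /limits.processorThreadMax:100
%windir%\system32\inetsrv\appcmd set config /section:asp /limits.requestQueueMax:400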

Max Pool Threads - this setting specifies the number of pool threads to create per processor. The entry did not exist, so we created it at HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\InetInfo\Parameters with a value of 20.
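
A minimal PowerShell sketch for creating that registry entry (run from an elevated prompt; restarting the IIS services afterwards is assumed to be needed for it to take effect):

# create the MaxPoolThreads DWORD under the InetInfo parameters key
New-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Services\InetInfo\Parameters" -Name "MaxPoolThreads" -Value 20 -PropertyType DWord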

Process model - we enabled the processModel element in machine.config and configured it with the following values (see the sketch after the httpRuntime list below):

  1. Max Worker Threads – 100
  2. Max I/O Threads – 100
  3. Min Worker Threads – 50

httpRuntime element - we also made some changes to this element:

  1. minFreeThreads – 352 (88 × N, where N is the number of CPUs, here 4)
  2. minLocalRequestFreeThreads – 304 (76 × N)
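
A minimal sketch of how the two elements above would look in machine.config (only the relevant attributes are shown):

<system.web>
  <processModel enable="true" maxWorkerThreads="100" maxIoThreads="100" minWorkerThreads="50" />
  <httpRuntime minFreeThreads="352" minLocalRequestFreeThreads="304" />
</system.web>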

Connection pool changes for SQL Server - the connection timeout was set to 3600 seconds by default, which is a very high value and not recommended: under heavy load, once every pooled connection is in use, requests end up waiting on the pool with no connection becoming available in time. The recommended value is 15, but we have set it to 30. We have also increased the pool size to 300.
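
As a sketch, the corresponding SqlClient connection string uses the Connect Timeout and Max Pool Size keywords; in a web.config it might look like this (server, database, and authentication details are placeholders):

<connectionStrings>
  <add name="AppDb" connectionString="Data Source=MyServer;Initial Catalog=MyDb;Integrated Security=True;Connect Timeout=30;Max Pool Size=300" />
</connectionStrings>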

These settings may vary according to the number of logical processors. You may also enable IIS logging to monitor your server and IIS behaviour for more effective tuning.

Database replication through Log Shipping over FTP

 


Steps for Log Shipping 

Log shipping is a basic high-availability technology that is part of SQL Server. It is an automated backup/restore process that allows you to create another copy of your database for failover.

Log shipping involves copying a database backup and subsequent transaction log backups from the primary (source) server and restoring the database and transaction log backups on one or more secondary (Stand By / Destination) servers. The Target Database is in a standby or no-recovery mode on the secondary server(s) which allows subsequent transaction logs to be backed up on the primary and shipped (or copied) to the secondary servers and then applied (restored) there.

Flow of the given solution


Permissions

To set up log shipping you must have sysadmin rights on the servers.
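
A quick sketch for checking that the login you plan to use actually has that role (returns 1 for members):

SELECT IS_SRVROLEMEMBER('sysadmin') AS is_sysadmin;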

Minimum Requirements

  1. SQL Server 2005 or later.
  2. The Standard, Workgroup or Enterprise edition must be installed on all server instances involved in log shipping.
  3. The servers involved in log shipping should have the same case-sensitivity settings.
  4. The database must use the full or bulk-logged recovery model.
  5. A shared folder for copying the T-Log backup files.
  6. SQL Server Agent Service must be configured properly.

In addition, you should use the same version of SQL Server on both ends. It is possible to log ship from SQL Server 2005 to SQL Server 2008, but you cannot do it the opposite way. Also, since log shipping will primarily be used for failover, having the same version on each end means that after a failover you at least know you are still running the same version of SQL Server.

In the current scenario we are going to use log shipping through FTP.


Steps to Configure Log-Shipping:

  1. Make sure your database uses the full or bulk-logged recovery model. You can check the recovery model by querying sys.databases and change it with ALTER DATABASE, as below.

SELECT name, recovery_model_desc FROM sys.databases WHERE name = 'TestD'
GO

USE [master]
GO
ALTER DATABASE [TestD] SET RECOVERY FULL WITH NO_WAIT
GO

2. On the primary server, right-click the database in SSMS and select Properties. Then select the Transaction Log Shipping page. Check the "Enable this as a primary database in a log shipping configuration" check box.


3.  The next step is to configure and schedule a transaction log backup. Click on Backup Settings… to do this.


If you are creating backups on a network share, enter the network path; for the local machine you can specify a local folder path. The backup compression feature was introduced in SQL Server 2008 Enterprise edition. While configuring log shipping, we can control the backup compression behaviour of log backups by specifying the compression option. When this step is completed it will create the backup job on the primary server.
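
As a rough sketch, the backup job this step creates runs a command of this general shape (the file name here is an assumption; the real job generates timestamped names in the folder you configured):

BACKUP LOG [TestD]
TO DISK = N'C:\LSDemo\Primary\TestD_tlog.trn'
WITH COMPRESSION;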


 Initialize database on Secondary Server

In this step you specify how to create the data on the secondary server: create a backup of the primary database and restore it, use an existing backup and restore it, or do nothing because you have manually restored the database and put it into the correct state to receive additional backups.
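
If you take the manual route, a minimal sketch of initializing the secondary looks like this (the full-backup path is a placeholder; the standby file path matches the one used in the restore script later in this post):

RESTORE DATABASE [TestD]
FROM DISK = N'C:\LSDemo\TestD_full.bak'
WITH STANDBY = N'G:\MSSQL10_50.TestD\MSSQL\Backup\ROLLBACK_UNDO_TestD.BAK';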

Copy Files

Here you have to specify the path of the destination FTP folder to which the log shipping copy job will copy the T-Log backup files. We create the copy job as a PowerShell script on the client machine and schedule it with the Windows Task Scheduler.

PowerShell Script for copying Log Shipping file to FTP

# the directory containing the files we want to upload
$Dir = "C:\LSDemo\Primary\"
# the directory files are moved to after a successful upload
$NewDir = "C:\LSDemo\BackupPServerLog\"
# FTP server
$ftp = "ftp://10.200.5.XXX/dir/"
$user = "Administrator"
$pass = "ABCDXXX"
Try
{
$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($user, $pass)
# give any in-progress log backup time to finish writing
Start-Sleep -s 30
# upload every SQL Server transaction log file
foreach ($item in (dir $Dir "*.trn"))
{
# "Uploading $item..."
$uri = New-Object System.Uri($ftp + $item.Name)
$webclient.UploadFile($uri, $item.FullName)
# move the uploaded file to the archive folder
Move-Item $item.FullName $NewDir
}

}
Catch
{
$ErrorMessage = $_.Exception.Message
$FailedItem = $_.Exception.ItemName

# optional: send a notification mail on failure
$EmailFrom = "xyz@gmail.com"
$EmailTo = "abc@live.com"
$Subject = "Notification from XYZ"
$Body = "We failed to upload file $FailedItem. The error message was $ErrorMessage"
$SMTPServer = "smtp.gmail.com"
$SMTPClient = New-Object Net.Mail.SmtpClient($SMTPServer, 587)
$SMTPClient.EnableSsl = $true
$SMTPClient.Credentials = New-Object System.Net.NetworkCredential("xyz@gmail.com", "password")
$SMTPClient.Send($EmailFrom, $EmailTo, $Subject, $Body)

Break
}
Finally
{
$Time = Get-Date
"This script made an upload attempt at $Time" | Out-File c:\logs\ExpensesScript.log -Append
}

 Restore Transaction Log

Here you have to specify the database restore state information and the restore schedule; this creates the restore job on the secondary server. Schedule the job in SQL Server Agent, and grant the SQL Server Agent account the required permissions on the folders containing the log backups. The restore script further below lists and moves files with xp_cmdshell, so enable that feature first:

-- To allow advanced options to be changed.
EXEC sp_configure 'show advanced options', 1
GO
-- To update the currently configured value for advanced options.
RECONFIGURE
GO
-- To enable the feature.
EXEC sp_configure 'xp_cmdshell', 1
GO
-- To update the currently configured value for this feature.
RECONFIGURE
GO

---------- Script ----------
USE master;
GO
SET NOCOUNT ON

-- 1 - Variable declaration
DECLARE @dbName sysname
DECLARE @backupPath NVARCHAR(500)
DECLARE @cmd NVARCHAR(500)
DECLARE @fileList TABLE (backupFile NVARCHAR(255))
DECLARE @StandByFile NVARCHAR(500)
DECLARE @NewLocation NVARCHAR(500)
DECLARE @backupFile NVARCHAR(500)

-- 2 - Initialize variables
SET @dbName = 'TestD'
SET @backupPath = 'G:\LSPro\Primary\'
SET @NewLocation = 'G:\LSPro\LogBackup\'
SET @StandByFile = 'G:\MSSQL10_50.TestD\MSSQL\Backup\ROLLBACK_UNDO_TestD.BAK'

BEGIN TRY

-- 3 - Get the list of files in the backup folder
SET @cmd = 'DIR /b ' + @backupPath
INSERT INTO @fileList(backupFile)
EXEC master.sys.xp_cmdshell @cmd

-- 4 - Check for log backups belonging to this database
DECLARE backupFiles CURSOR FOR
SELECT backupFile FROM @fileList
WHERE backupFile LIKE '%.TRN' AND backupFile LIKE @dbName + '%'

OPEN backupFiles

-- Loop through all the files for the database
FETCH NEXT FROM backupFiles INTO @backupFile
WHILE @@FETCH_STATUS = 0
BEGIN

----- Restore the log file -----
SET @cmd = 'RESTORE LOG ' + @dbName + ' FROM DISK = ''' + @backupPath + @backupFile + ''' WITH NORECOVERY'
EXECUTE (@cmd)

----- Move the applied log file out of the pickup folder -----
-- (replace MOVE with DEL here if you prefer to delete the file)
SET @cmd = 'MOVE ' + @backupPath + @backupFile + ' ' + @NewLocation
EXEC master.sys.xp_cmdshell @cmd

FETCH NEXT FROM backupFiles INTO @backupFile
END
END TRY
BEGIN CATCH

CLOSE backupFiles
DEALLOCATE backupFiles

-- 5 - Put the database in a standby state and log the failure
SET @cmd = 'RESTORE DATABASE ' + @dbName + ' WITH STANDBY = ''' + @StandByFile + ''''
EXECUTE (@cmd)
EXEC master..xp_cmdshell 'echo Error in Restore logs > c:\datarecoverylogs\logs.txt'
RETURN

END CATCH

CLOSE backupFiles
DEALLOCATE backupFiles

-- 5 - Put the database in a standby state
SET @cmd = 'RESTORE DATABASE ' + @dbName + ' WITH STANDBY = ''' + @StandByFile + ''''
EXECUTE (@cmd)
--------End--------

PowerShell script to delete log files from the servers (older than X days)

#----- define parameters -----#
#----- get current date -----#
$Now = Get-Date
#----- define the age threshold in days -----#
$Days = 3
#----- define the folder where files are located -----#
$TargetFolder = 'G:\LSPro\LogBackup\'
#----- define the extension -----#
$Extension = "*.trn"
#----- compute the LastWriteTime cut-off based on $Days -----#
$LastWrite = $Now.AddDays(-$Days)
#----- get files based on the last-write filter in the specified folder -----#
$Files = Get-ChildItem $TargetFolder -Include $Extension -Recurse | Where {$_.LastWriteTime -le $LastWrite}
foreach ($File in $Files)
{
if ($File -ne $NULL)
{
Write-Host "Deleting File $File" -ForegroundColor "DarkRed"
Remove-Item $File.FullName | Out-Null
}
else
{
Write-Host "No more files to delete!" -ForegroundColor "Green"
}
}

Apply this PowerShell script on both the primary server and the secondary server to remove old log files.

To Schedule a PowerShell Script in Windows Task Scheduler 

Running PowerShell scripts as Windows scheduled tasks: here's a quick summary of what I see as the way to do this.

1.  Get your script ready

Surprising as it might sound, your script might actually not be ready to run in a scheduled task as is. This happens if it uses cmdlets from a particular PowerShell module or snap-in, and it worked for you interactively because you used a specialized shell (e.g. Exchange Management Shell) or a tool like PowerGUI Script Editor which loads the modules for you.

If you are indeed using any non-default cmdlets, simply add Add-PSSnapin or Import-Module to the beginning of the script. For example:

Add-PSSnapin Quest.ActiveRoles.ADManagement

2.  Schedule the task

To schedule a task, simply start Windows Task Scheduler and schedule the powershell.exe executable, passing the script execution command as a parameter. The -File parameter is the default one, so simply specifying the script path as the argument works in a lot of cases:
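
For example, using the same hypothetical script path as in the examples further below:

Program: C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe

Add argument (optional): -File c:\scripts\hello.ps1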

 

You can find powershell.exe in your system32\WindowsPowerShell\v1.0 folder.

3. Report task success or failure

If you want your script to report success or failure (or some other numerical result), simply use the exit keyword in the script to pass the value, e.g.:

exit 4

Then your Windows Task Scheduler will show the value in the Last Run Result (you might need to hit F5 to refresh the column in the task scheduler):

 

4. Passing parameters

If you need to pass parameters, things get a little trickier. Say you have a script which adds two numbers:

param($a=2, $b=2)
"Advanced calculations ahead"
exit ($a + $b)

To pass the numbers as parameters, you would want to use powershell.exe -Command instead of powershell.exe -File. The -Command argument will then have the script invocation operator &, the path to the script, and the parameters. E.g.:

Program: C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe

Add argument (optional): -Command "& c:\scripts\hello.ps1 -a 2 -b 3"

If you want to also get your exit code from the script, you need to re-transmit it by adding exit $LASTEXITCODE to the command (I learned this tip from MoW). E.g.:

Program: C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe

Add argument (optional): -Command "& c:\scripts\hello.ps1 -a 2 -b 3; exit $LASTEXITCODE"

5.  Run x86 PowerShell on x64 Windows

On 64-bit versions of Windows you actually have both 64-bit and 32-bit versions of PowerShell. In most cases you don't care, but in some cases (e.g. specific COM objects being used) you might specifically need the 32-bit version. To get that to run, simply pick the proper executable when you schedule the task:

Regular PowerShell (64-bit version on 64-bit Windows): %SystemRoot%\system32\WindowsPowerShell\v1.0\powershell.exe

32-bit PowerShell (x86): %SystemRoot%\syswow64\WindowsPowerShell\v1.0\powershell.exe

6.  Other options

To learn about all the parameters the PowerShell executable has, simply run it with the /? option (from either cmd.exe or a PowerShell session).

I normally use -noprofile to make sure that nothing in the PowerShell profile interferes with the task.

Also, if your execution policy does not allow running scripts, the -ExecutionPolicy parameter comes in handy, allowing you to make an exception just for this task. Note that it has to come before -File, because everything after the script path is passed to the script as arguments. E.g.:

c:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -NoProfile -ExecutionPolicy RemoteSigned -File c:\scripts\hello.ps1

 

Alternatives to using CAPICOM for digitally signing documents

CAPICOM is a 32-bit-only component that is available for use on the following operating systems: Windows Server 2008, Windows Vista, and Windows XP. For Windows 8 or Windows Server 2012 systems, no alternatives to CAPICOM offer a solution for scripts; therefore, you must write your own ActiveX control. The documented alternatives fall into the following categories:

  • Certificate Store Objects
  • Digital Signature Objects
  • Enveloped Data Objects
  • Data Encryption Objects
  • Auxiliary Objects

 

Digital Signature Objects

We suggest the following alternatives to digitally sign data and to verify digital signatures.

Object: SignedCode

Alternative: The SignedCode object is available for use in the operating systems specified in the Requirements section. Instead, use Platform Invocation Services (P/Invoke) to call the Win32 API SignerSignEx, SignerTimeStampEx, and WinVerifyTrust functions to sign content with an Authenticode digital signature. For information about P/Invoke, see Platform Invoke Tutorial. The .NET and CryptoAPI via P/Invoke: Part 1 and .NET and CryptoAPI via P/Invoke: Part 2 subsections of Extending .NET Cryptography with CAPICOM and P/Invoke may also be helpful.

Object: SignedData

Alternative: The SignedData object is available for use in the operating systems specified in the Requirements section. Instead, use the SignedCms class in the System.Security.Cryptography.Pkcs namespace.

Object: Signer

Alternative: The Signer object is available for use in the operating systems specified in the Requirements section. Instead, use the CmsSigner class in the System.Security.Cryptography.Pkcs namespace.

Object: Signers

Alternative: The Signers object is available for use in the operating systems specified in the Requirements section. Instead, use a collection of CmsSigner objects. For more information, see the CmsSigner class in the System.Security.Cryptography.Pkcs namespace.
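
As a minimal sketch of the SignedCms/CmsSigner route (the method names are illustrative; cert must be an X509Certificate2 with an associated private key):

using System.Security.Cryptography.Pkcs;
using System.Security.Cryptography.X509Certificates;
using System.Text;

public static byte[] SignData(byte[] message, X509Certificate2 cert)
{
// wrap the payload and sign it as a PKCS #7/CMS message
SignedCms signedCms = new SignedCms(new ContentInfo(message));
signedCms.ComputeSignature(new CmsSigner(cert));
return signedCms.Encode(); // DER-encoded signed message
}

public static void VerifyData(byte[] encoded)
{
// throws a CryptographicException if the signature does not verify
SignedCms signedCms = new SignedCms();
signedCms.Decode(encoded);
signedCms.CheckSignature(true); // true = verify the signature only, skip chain validation
}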

Example: Signing an XML document

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Security.Cryptography.X509Certificates;
using System.Xml;
using System.Security.Cryptography.Xml;
using System.Text;
using System.Security.Cryptography;

namespace DSAPP
{
public partial class SignCert : System.Web.UI.Page
{
protected void Page_Load(object sender, EventArgs e)
{
try
{
X509Store store = new X509Store(StoreName.My);
store.Open(OpenFlags.ReadOnly);

// Create a new XML document.
XmlDocument doc = new XmlDocument();

// Load the passed XML file using its name.
doc.Load(new XmlTextReader(@"C:\XMLfile\Example.xml"));

if (store.Certificates.Count > 0)
{
foreach (X509Certificate2 cert in store.Certificates)
{

if (cert.Issuer.Contains("CN=(n)Code Solutions CA 2011-1"))
{

SignXmlFile(@"C:\XMLfile\Example.xml", @"C:\XMLfile\signedExample.xml", cert);
Response.Write("XML file signed.");

Response.Write("Verifying signature...");
bool result = VerifyXmlFile(@"C:\XMLfile\signedExample.xml", cert);

// Display the results of the signature verification to
// the console.
if (result)
{
Response.Write("The XML signature is valid.");
}
else
{
Response.Write("The XML signature is not valid.");
}

Response.Write("\n Signed by " + cert.ToString());

}

}
}

}
catch (CryptographicException ex)
{
Response.Write(ex.Message);
}
}

public void SignDocument(XmlDocument doc, string id, X509Certificate2 cert)
{
SignedXml signedXml = new SignedXml(doc);
signedXml.SignedInfo.CanonicalizationMethod = SignedXml.XmlDsigExcC14NTransformUrl;
signedXml.SigningKey = cert.PrivateKey;

// Retrieve the value of the “ID” attribute on the root assertion element.
// Reference reference = new Reference(“#” + id);
Reference reference = new Reference();
reference.Uri = "";

reference.AddTransform(new XmlDsigEnvelopedSignatureTransform());
reference.AddTransform(new XmlDsigExcC14NTransform());

signedXml.AddReference(reference);

// Include the public key of the certificate in the assertion.
signedXml.KeyInfo = new KeyInfo();
signedXml.KeyInfo.AddClause(new KeyInfoX509Data(cert, X509IncludeOption.WholeChain));

signedXml.ComputeSignature();
// Append the computed signature. The signature must be placed as the sibling of the Issuer element.
// Saml20Constants.ASSERTION in the original resolves to the SAML 2.0 assertion namespace URI
XmlNodeList nodes = doc.DocumentElement.GetElementsByTagName("Issuer", "urn:oasis:names:tc:SAML:2.0:assertion");
// doc.DocumentElement.InsertAfter(doc.ImportNode(signedXml.GetXml(), true), nodes[0]);
nodes[0].ParentNode.InsertAfter(doc.ImportNode(signedXml.GetXml(), true), nodes[0]);
Response.Write("Doc Signed");
}

public static void SignXmlFile(string FileName, string SignedFileName, X509Certificate2 cert)
{
// Create a new XML document.
XmlDocument doc = new XmlDocument();

// Load the passed XML file using its name.
doc.Load(new XmlTextReader(FileName));

// Create a SignedXml object.
SignedXml signedXml = new SignedXml(doc);

// Add the key to the SignedXml document.
signedXml.SignedInfo.CanonicalizationMethod = SignedXml.XmlDsigExcC14NTransformUrl;
signedXml.SigningKey = cert.PrivateKey;

// Create a reference to be signed.
Reference reference = new Reference();
reference.Uri = "";

// Add an enveloped transformation to the reference.
XmlDsigEnvelopedSignatureTransform env = new XmlDsigEnvelopedSignatureTransform();
reference.AddTransform(env);

// Add the reference to the SignedXml object.
signedXml.AddReference(reference);

// Compute the signature.
signedXml.ComputeSignature();

// Get the XML representation of the signature and save
// it to an XmlElement object.
XmlElement xmlDigitalSignature = signedXml.GetXml();

// Append the element to the XML document.
doc.DocumentElement.AppendChild(doc.ImportNode(xmlDigitalSignature, true));

if (doc.FirstChild is XmlDeclaration)
{
doc.RemoveChild(doc.FirstChild);
}

// Save the signed XML document to a file specified
// using the passed string.
XmlTextWriter xmltw = new XmlTextWriter(SignedFileName, new UTF8Encoding(false));
doc.WriteTo(xmltw);
xmltw.Close();
}

// Verify the signature of an XML file against an asymmetric
// algorithm and return the result.
public static Boolean VerifyXmlFile(String Name, X509Certificate2 cert)
{
// Create a new XML document.
XmlDocument xmlDocument = new XmlDocument();

// Load the passed XML file into the document.
xmlDocument.Load(Name);

// Create a new SignedXml object and pass it
// the XML document class.
SignedXml signedXml = new SignedXml(xmlDocument);

// Find the “Signature” node and create a new
// XmlNodeList object.
XmlNodeList nodeList = xmlDocument.GetElementsByTagName("Signature");

// Load the signature node.
signedXml.LoadXml((XmlElement)nodeList[0]);

// Check the signature and return the result.
return signedXml.CheckSignature(cert,true);
}

}
}

Hello Everyone!

Welcome to Study Desk. This is the place where you and I explore old and new technologies that may guide us toward a better future!

Welcome