Wednesday, July 25, 2018

MS SQL: How to configure log shipping with FreeFileSync over FTP/SFTP

In MS SQL, log shipping is a popular technique for:
  • Copying and syncing a database to multiple servers
  • High availability / failover / disaster recovery / backup plans
  • Feeding a data warehouse / reporting server
Compared with mirroring, log shipping has higher latency (10-15 minutes versus a few seconds), but its secondary database(s) can be used for read-only access, whereas a mirrored database cannot be used at all (and mirroring allows only one mirror). So, depending on your system's requirements, you can use either of these techniques or both.

Normally, log shipping is configured natively with a shared folder that the secondary server can access over a private or public network. You can restrict the shared folder to certain users or IP addresses (via firewall). However, a shared folder has disadvantages: it raises security issues when exposed over the Internet, and you cannot control the bandwidth used when transferring files.

In this article, I want to introduce another way to do log shipping, over FTP/SFTP using FreeFileSync, which solves the disadvantages above.

1. Prepare the environment

On the secondary server:
  • Set up an FTP server (FileZilla Server works well) on the secondary server, then configure an FTP folder for receiving transaction log files from the primary server. For example, the FTP folder is mapped to the local folder C:\FTP\FileSync.
  • Create a folder (e.g. C:\Translog) into which the log files synced to C:\FTP\FileSync will be copied.
  • Grant full control on the folder C:\Translog to the account running the SQL Server Agent service (normally NT Service\SQLSERVERAGENT, check in the Services tool) and the account running the SQL Server service (normally NT Service\MSSQLSERVER).
  • Open C:\Windows\System32\drivers\etc\hosts with Administrator rights, then add a line mapping the primary server's IP address to its name, for example: 192.168.72.100 PRIMARY_SERVER_NAME (to get the host name, open CMD on the primary server and run the command hostname).
On the primary server:
  • Create a folder to hold the exported log files (e.g. Z:\Translog). Grant full control on Z:\Translog to the accounts running the SQL Server Agent service and the SQL Server service.
  • Open C:\Windows\System32\drivers\etc\hosts with Administrator rights, then add a line mapping the secondary server's IP address to its name, for example: 192.168.72.101 SECONDARY_SERVER_NAME
  • Download FreeFileSync and install it on the primary server.
  • Create a FreeFileSync batch job that syncs files (Mirror variant) from Z:\Translog to the FTP folder on the secondary server. See the tutorial videos at https://freefilesync.org/tutorials.php and the following pictures for how to do it.
  • Set up RealTimeSync with that batch job to watch Z:\Translog and sync files in real time. See the tutorial videos at https://freefilesync.org/tutorials.php and the following pictures for how to do it.



2. Set up log shipping on the primary server using SQL Server Management Studio

Make sure your DB uses the Full recovery model (or Bulk-logged). Right-click your DB >> select Properties and check/change it in the Options section:


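If you prefer T-SQL, you can check and change the recovery model with a quick script like this (TestDB is just the example database name used in this article):

  -- Check the current recovery model
  SELECT name, recovery_model_desc FROM sys.databases WHERE name = N'TestDB';

  -- Switch to the Full recovery model if needed
  ALTER DATABASE TestDB SET RECOVERY FULL;
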
Move to the Transaction Log Shipping section and tick Enable this as a primary database...:


Then click the Backup Settings button to open the form below:


In the box Network path to backup folder..., enter \\YOURSERVER\SF or any shared folder path you like. It is a fake folder that we won't actually use; it is only needed to enable the OK button. In the second box, enter the folder Z:\Translog which you prepared. You can change the other parameters according to your needs. I recommend using the Compress backup option to save bandwidth when transferring logs over the Internet. Click OK to finish this form.
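
Behind this form, SSMS calls the log shipping stored procedures for you, so treat the following only as a rough sketch of how the key settings map to T-SQL (the job name and retention period are assumed example values):

  -- Rough T-SQL equivalent of the Backup Settings form (the wizard generates the full version)
  EXEC master.dbo.sp_add_log_shipping_primary_database
      @database = N'TestDB',
      @backup_directory = N'Z:\Translog',      -- local folder watched by FreeFileSync/RealTimeSync
      @backup_share = N'\\YOURSERVER\SF',      -- the fake share, not actually used in this setup
      @backup_job_name = N'LSBackup_TestDB',   -- example job name
      @backup_retention_period = 4320,         -- minutes (3 days), an example value
      @backup_compression = 1;                 -- 1 = compress backups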

In the Monitor server instance area, tick Use a monitor server instance, then click the Settings button to bring up the following form:


Click the Connect button and connect to the secondary server. You can change the value of Delete history after, then click OK to finish.

Back in the Transaction Log Shipping section, click OK to apply the configuration for the primary server. It will enable the primary database for log shipping and create 2 jobs: LSBackup_XXX under SQL Server Agent >> Jobs on the primary server and LSAlert_XXX under SQL Server Agent >> Jobs on the secondary server.
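
To verify that the jobs exist, you can query the SQL Server Agent job table on each server, for example:

  -- Log shipping jobs created by the wizard all have names starting with "LS"
  SELECT name, enabled, date_created
  FROM msdb.dbo.sysjobs
  WHERE name LIKE N'LS%'
  ORDER BY name;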

Right-click your DB, select Tasks >> Back Up..., and set Backup type to Full. Then back up your DB to disk. This backup file will be copied to the secondary server to initialize the secondary database.
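
The same full backup in T-SQL could look like this (the backup path and file name are just examples; put the file wherever is convenient to copy from):

  -- Full, compressed backup of the primary database
  BACKUP DATABASE TestDB
  TO DISK = N'C:\Backup\TestDB_full.bak'
  WITH COMPRESSION, INIT, STATS = 10;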

Before going to the secondary server to configure it, we will use a trick to get a script to run on that server. Open the Transaction Log Shipping section again and, under Secondary databases, click the Add button to open the form below:


Click the Connect button, connect to the Secondary server instance, and enter the Secondary database name (e.g. TestDB). Select the option "No, the secondary database is initialized", then move to the Copy Files tab:


Enter the folder on the secondary server that transaction log files will be copied to (e.g. C:\Translog_Copy). Then move to the Restore Transaction Log tab:


Select Standby mode and tick "Disconnect users..." to put the secondary database into read-only mode and disconnect users during restores. Then click OK.
Back in the Transaction Log Shipping section, click Script Configuration >> Script Configuration to New Query Window at the bottom to see the generated script. Before viewing the script, click the Cancel button to close the Transaction Log Shipping dialog without applying anything.


Go to the query window that was opened and copy the part of the script between "Begin: Script to be run at Secondary" and "End: Script to be run at Secondary"; we will run it on the secondary server later.

3. Set up log shipping on the secondary server

First of all, restore the full backup of the primary database on the secondary server using RESTORE WITH STANDBY (you should do this restore before generating the script in the step above):


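A minimal T-SQL sketch of that restore, assuming the backup file was copied to C:\Backup on the secondary server (the undo file path is also just an example):

  -- Restore in STANDBY mode so the database is read-only and can keep accepting log restores
  RESTORE DATABASE TestDB
  FROM DISK = N'C:\Backup\TestDB_full.bak'
  WITH STANDBY = N'C:\Translog_Copy\TestDB_standby.tuf',
       STATS = 10;
  -- Add MOVE clauses if the data/log file paths differ from the primary server.
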
After restoring, you will see the DB marked (Standby / Read-Only). Open a new query window and paste the script copied above into it (remember to select msdb as the database when you run the script later):


Yes, nearly done! We just need to modify the script a little to change the fake shared folder (\\YOURSERVER\SF) to the local folder (C:\Translog) on the secondary server that receives the log files transferred from the primary server; see below for the idea:


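The generated script differs a bit between SSMS versions, but the part to edit is the @backup_source_directory parameter of sp_add_log_shipping_secondary_primary; it looks roughly like this (the values shown are the examples used in this article):

  -- Only @backup_source_directory needs to change; keep everything else the wizard generated
  EXEC master.dbo.sp_add_log_shipping_secondary_primary
      @primary_server = N'PRIMARY_SERVER_NAME',
      @primary_database = N'TestDB',
      @backup_source_directory = N'C:\Translog',        -- was the fake share \\YOURSERVER\SF
      @backup_destination_directory = N'C:\Translog_Copy',
      @copy_job_name = N'LSCopy_PRIMARY_SERVER_NAME_TestDB',
      @restore_job_name = N'LSRestore_PRIMARY_SERVER_NAME_TestDB',
      @file_retention_period = 4320                     -- minutes (3 days), example value
      -- ...the remaining parameters (job IDs, overwrite, etc.) stay exactly as generated
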
That's it. Run the script. It will create 2 scheduled jobs under SQL Server Agent >> Jobs: LSCopy_XXX (copies files from C:\Translog to C:\Translog_Copy every 15 minutes) and LSRestore_XXX (restores the transaction logs in C:\Translog_Copy to the secondary database every 15 minutes). You can go there to change the schedule of these 2 jobs.

Voila! You now have the secondary database synced with the primary database at a 15-30 minute interval (depending on the timing between LSCopy_XXX and LSRestore_XXX).
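
To confirm that logs are really being copied and restored, you can query the log shipping monitor table on the secondary server:

  -- Last copied and restored log file per secondary database
  SELECT secondary_database, last_copied_file, last_copied_date,
         last_restored_file, last_restored_date
  FROM msdb.dbo.log_shipping_monitor_secondary;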

That's all. Any comments are welcome.

Monday, July 16, 2018

Windows RDP: Solving the error "This could be due to CredSSP encryption oracle remediation"

Do you get the error below when connecting to your server over Remote Desktop?


It is normally caused by your Windows client having received a new CredSSP update while your Windows server has not.
In the long term, you should update your server for better security.
As a temporary fix, you can follow these steps on your Windows client to be able to connect to your server:

1. Open the Run box and type gpedit.msc to open the Local Group Policy Editor.

2. Navigate to Computer Configuration >> Administrative Templates >> System >> Credentials Delegation >> Encryption Oracle Remediation. Then select Enabled, set the protection level to Vulnerable, and click OK.


3. Finally, open CMD (run as Administrator) and run the command gpupdate /force to update the policy immediately.

Hopefully you can now connect to your server and update it. Have a nice day.