Wednesday, July 8, 2020

Calculating Monthly Balance for GLAccounts using BSEG

Hi,
I have been working a lot with SAP-related systems, both ECC and CRM. I will tell you, it was a real nightmare when I started my journey almost a year back in 2019: SAP systems have all their table and column names based in German. There is an organization called Simplement which adds a lot of helpful English texts that help us understand what a table or column actually means. Trust me, it helps a lot to keep your sanity in place when you are just starting off. Later on, once you get the hang of it, you will remember the most important tables like VBAK, VBRK, BSEG and BKPF off the top of your head. These tables are very wide and have multiple columns which sound similar and hold similar data, and sometimes it makes no sense to the plain eye why so many similar columns are required. But SAP uses each column for a specific purpose and keeps these columns populated so it has fewer aggregations or calculations to do on the fly.

Well, to begin with today's article, I want to tell you what BSEG actually is and how you can calculate a balance for all GL accounts of a company. BSEG is the Accounting Document Segment and BKPF is the Accounting Document Header.
Here goes the complicated yet beautiful query.

SELECT h.[BUKRS: (PK) Company Code],
       s.[HKONT: General Ledger Account],
       h.[GJAHR: (PK) Fiscal Year],
       MONTH(h.[BUDAT: Posting Date in the Document]) AS PostingMonth,
       SUM(CASE WHEN s.[SHKZG: Debit/Credit Indicator] = 'H'
                THEN s.[DMBE2: Amount in Second Local Currency] * -1
                ELSE s.[DMBE2: Amount in Second Local Currency]
           END) AS GLAmountUSD
FROM LIB_ECP_RTP.bv.[BSEG: Accounting Document Segment] s WITH (NOLOCK)
JOIN LIB_ECP_RTP.bv.[BKPF: Accounting Document Header] h WITH (NOLOCK)
  ON h.[BELNR: (PK) Accounting Document Number] = s.[BELNR: (PK) Accounting Document Number]
 AND h.[BUKRS: (PK) Company Code] = s.[BUKRS: (PK) Company Code]
 AND h.[GJAHR: (PK) Fiscal Year] = s.[GJAHR: (PK) Fiscal Year]
 AND MONTH(h.[BUDAT_SIMP_DT: (GC) Posting Date in the Document]) = h.[MONAT: Fiscal Period]
 AND h.[MANDT: (PK) Client] = s.[MANDT: (PK) Client]
WHERE 1=1
AND h.[BUKRS: (PK) Company Code] = 'YourCompany'
AND h.[GJAHR: (PK) Fiscal Year] = '2020'
AND s.[VORGN: Transaction Type for General Ledger] <> 'RFIG'
AND s.[UMSKZ: Special G/L Indicator] NOT IN ('F', 'A')
GROUP BY MONTH(h.[BUDAT: Posting Date in the Document]),
         h.[BUKRS: (PK) Company Code],
         s.[HKONT: General Ledger Account],
         h.[GJAHR: (PK) Fiscal Year]
ORDER BY h.[BUKRS: (PK) Company Code],
         s.[HKONT: General Ledger Account],
         h.[GJAHR: (PK) Fiscal Year],
         MONTH(h.[BUDAT: Posting Date in the Document])


Now, we are using BUKRS as the first column, which is nothing but the company code. The next is the GL account (HKONT), which is what you actually want to see the monthly balance for.
GJAHR is the fiscal year; here we have filtered on 2020.
Next is BUDAT, which is the posting date; we take the month from it, and that is the granularity at which we want to see the balance.
Next is GLAmountUSD, which is the sum of the amount with a CASE on the debit/credit indicator (SHKZG). When it is 'H' the amount is multiplied by -1 so the credit is reversed, because the amount column itself is not signed.
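
If you also need a running (year-to-date) balance rather than just the per-month activity, a window function over the output above does the trick. Here is a minimal sketch, assuming the query above has been materialized into a hypothetical table called MonthlyActivity with columns CompanyCode, GLAccount, FiscalYear, PostingMonth and GLAmountUSD:

-- Hedged sketch: MonthlyActivity is a hypothetical table holding the
-- output of the query above.
SELECT CompanyCode,
       GLAccount,
       FiscalYear,
       PostingMonth,
       GLAmountUSD AS MonthlyActivityUSD,
       -- accumulate the monthly activity month by month within each account/year
       SUM(GLAmountUSD) OVER (PARTITION BY CompanyCode, GLAccount, FiscalYear
                              ORDER BY PostingMonth
                              ROWS UNBOUNDED PRECEDING) AS YTDBalanceUSD
FROM MonthlyActivity
ORDER BY CompanyCode, GLAccount, FiscalYear, PostingMonth;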

So there you have it, the monthly balance in a very condensed form.

Thanks,
Tushar Kanti 

Monday, April 6, 2020

Restoring multiple databases via script

Hi Guys,
Hope you are having fun with SQL as usual.
I have developed a script which I thought may be useful to the community in case they need to restore a large number of databases from backups without using any enterprise restore software.

The script is broken into two major parts:
1. Getting the names of the databases that need to be restored.
2. Building the restore script for the databases in question.

Now let's delve into the scripts.
1. Names of the databases
In this example I am restoring the databases that are not online. That is just one scenario; you can adapt the filter to whatever scenario you have.

-- Pick the most recent full backup (type = 'D') for every database
-- that is not currently online.
SELECT b.database_name, f.physical_device_name, b.type, b.backup_start_date
INTO #backup_tab
FROM msdb..backupset b
JOIN msdb..backupmediafamily f ON f.media_set_id = b.media_set_id
WHERE b.type = 'D'
AND b.database_name IN (SELECT name FROM sys.databases WHERE state_desc <> 'ONLINE')
AND b.backup_start_date = (SELECT MAX(b2.backup_start_date)
                           FROM msdb..backupset b2
                           WHERE b2.database_name = b.database_name AND b2.type = 'D')

Now we have the backup set information for the databases in question. Let's move on to the next part of the query.

2. Restore Script
Here we use a cursor. I personally do not like them, but this time I thought it would be easy to use one; a cursor-free sketch follows at the end of this post.

DECLARE @dbname sysname, @backpath nvarchar(260), @res_script varchar(max)

-- Walk through the databases captured in #backup_tab one by one.
DECLARE rest_cur CURSOR FOR
SELECT database_name FROM #backup_tab ORDER BY 1

OPEN rest_cur
FETCH NEXT FROM rest_cur INTO @dbname
WHILE @@FETCH_STATUS = 0
BEGIN
SELECT @backpath = physical_device_name FROM #backup_tab WHERE database_name = @dbname
SELECT @res_script = ''

SELECT @res_script = 'RESTORE database [' + @dbname + '] from disk =' + '''' + @backpath + ''''

SELECT @res_script = @res_script + ' WITH '

-- Concatenate one MOVE clause per data/log file of the database.
SELECT @res_script = @res_script + ' move ''' + name + ''' ' + 'to ''' + physical_name + ''','
FROM sys.master_files
WHERE 1=1
AND database_id = DB_ID(@dbname)

SELECT @res_script = @res_script + 'stats=1 ,replace'

print @res_script
-- Uncomment the line below to execute the restores directly instead of printing them.
--EXEC(@res_script)
FETCH NEXT FROM rest_cur INTO @dbname

END
CLOSE rest_cur
DEALLOCATE rest_cur

DROP TABLE #backup_tab
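
And since I said I would try it without a cursor: here is a rough set-based sketch of the same idea, assuming SQL Server 2017 or later for STRING_AGG. It builds one restore command per row from the same #backup_tab table:

-- Hedged sketch, assumes SQL Server 2017+ for STRING_AGG.
SELECT 'RESTORE DATABASE [' + t.database_name + '] FROM DISK = ''' + t.physical_device_name + ''''
       + ' WITH '
       + (SELECT STRING_AGG('MOVE ''' + mf.name + ''' TO ''' + mf.physical_name + '''', ', ')
          FROM sys.master_files mf
          WHERE mf.database_id = DB_ID(t.database_name))
       + ', STATS = 1, REPLACE' AS restore_script
FROM #backup_tab t;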

It's one of my hot favourite scripts in a disaster recovery scenario. Hope you guys liked it.

Thanks,
Tushar

Recovering Master Database with the Master Data and Log Files

Hi,
There are times when you will need to fix a corrupt database server and restore your master database from a good old backup. That situation is fairly well documented and needs no further elaboration, with so many articles floating around from the gods of SQL Server to help us.

But I found another situation which was new and different from the documented ones.
I will start by giving you the configuration layout, so it is easier to imagine the pain and the progress.

OS Installation - C:\ Drive 
SQL Installation - D:\ Drive 
SQL user data and log files - E:\ Drive (I know, please do not shout, it's our dev and I did not build it)
SQL Backup - J:\ Drive

SQL Server version - SQL Server 2016 upgraded from SQL Server 2012. 

Ok, now that we have the baseline set, let me show you the scenario.

The OS crashed (aka the C:\ drive), and the backup drive crashed too (aka the J:\ drive).

So we had to get the OS rebuilt to get something up and running.
SQL was definitely not up, because the new OS had no clue there was any installation.

So we discussed with the infra team that they would install SQL 2012, then upgrade to 2016 and bring it to the same patch level as before the disaster, and then we would try to re-attach the old system database files to bring up the system.

But as I have mentioned before, things do not get fixed so easily once they are messed up in the DB world. Especially when someone is blogging about it.

So when I got the system back, I saw that the super smart infra guy had installed 2016 with the latest patch directly, rather than following the upgrade path, and had improvised by choosing E:\ as the new path for the system databases rather than the default D:\ drive. He was very proud that he had fixed a glitch in the original install plan.


If only he could understand my pain. But let me help you understand by elaborating a little on this.

1. The old system database files were on the D:\ drive when the disaster happened, so if we put them under the new path and start the server, it will not come up, because the old master database still points to the old file locations.
2. If we put the old system databases in the old path and try to start SQL Server, then the resource database will have path issues.

So I devised an interim plan: first, modify all the freshly installed system databases to point to the old database file path on the D:\ drive, so the resource database reference would be consistent. Second, shut down the database server and swap in the old database files to bring up the system.
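
A minimal sketch of that first step, with illustrative paths and the default logical file names (your actual names may differ; master itself is relocated through the -d and -l startup parameters in SQL Server Configuration Manager, not through ALTER DATABASE):

-- Hedged sketch: repoint the freshly installed system databases to the old
-- D:\ location. Logical names below are the defaults; paths are illustrative.
ALTER DATABASE msdb  MODIFY FILE (NAME = MSDBData, FILENAME = 'D:\SQLData\MSDBData.mdf');
ALTER DATABASE msdb  MODIFY FILE (NAME = MSDBLog,  FILENAME = 'D:\SQLData\MSDBLog.ldf');
ALTER DATABASE model MODIFY FILE (NAME = modeldev, FILENAME = 'D:\SQLData\model.mdf');
ALTER DATABASE model MODIFY FILE (NAME = modellog, FILENAME = 'D:\SQLData\modellog.ldf');
-- Then shut the instance down, copy the old files into place, and restart.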

Well, after swapping in the old database files and restarting SQL Server, the system resumed and all the databases and other objects came online.

This is how we avoided having to redo all the logins, packages and jobs from prod to dev, and saved weeks of development effort.


Thanks,
Tushar Kanti 

Friday, April 3, 2020

Removing SQL Databases from Recovery Pending Status

Hi Guys,
Today I met with an unusual situation where a couple of drives in the dev box crashed. Luckily the system databases were not affected, so I cooked up a script to generate restores of the user databases from the last good backup. It should be a piece of cake, I assumed, with the restore script to my rescue.


Sample restore Script. 
--------------------------------------------------------------------------------
-- Generates the skeleton of a RESTORE command for one database, with a MOVE
-- clause for each of its files; paste in the backup path by hand.
DECLARE @dbname sysname ='dba'

SELECT 'RESTORE database [' + @dbname + '] from disk =' AS 'restore script'
UNION ALL
SELECT 'WITH ' AS 'restore script'
UNION ALL
SELECT 'move '''+name+''' ' + 'to ''' +physical_name+''',' AS 'restore script'
FROM sys.master_files
WHERE 1=1
AND database_id=DB_ID(@dbname)
UNION ALL
SELECT 'stats=1,replace ' AS 'restore script'
-----------------------------------------------------------------------------------------

Well, those of you thinking this was a happily-ever-after story are almost wrong. Things do not turn good so easily once they get messed up in the DB world.

To my surprise I was not able to: a. restore with overwrite, b. alter the database to single-user mode, c. take the database offline, or d. drop the database. And as you might have guessed by now, the reason is that database recovery starts automatically when SQL Server restarts. I have always loved that feature, since you do not have to start hundreds of databases manually, or through a script, every time you restart SQL Server. But just this once I was aching for a way to stop the recovery of the databases I did not want. I finally got to the answer through a lot of soul searching and blog posts (Amit Bansal's website).

So here goes the list of my attempts:
1. Start SQL Server in single-user mode and try to take the user databases offline - did not work.
2. Start SQL Server in single-user mode and try to drop the user databases - did not work.
3. Start SQL Server in minimal configuration mode and try to take the user databases offline - did not work.
4. Start SQL Server in minimal configuration mode and try to drop the user databases - did not work.
5. Set trace flag 3607 in the SQL Server startup parameters and try to take the user databases offline - did not work.
6. Set trace flag 3607 in the SQL Server startup parameters and try to drop the user databases - did not work.

Then I finally hit gold: I set trace flag 3608 in the SQL Server startup parameters and started SQL Server.
Most of the user databases showed online, though some were in recovery or recovery pending.
I dropped the databases which were in online status and restarted SQL Server with trace flag 3608 still in the startup parameters, to bring the remaining user databases online, and then dropped them too.
Finally I removed trace flag 3608, restarted SQL Server, and all the user databases were gone.
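
For reference, here is a small sketch (not the exact commands from that day) to see which user databases are stuck and to generate the DROP statements once they show online:

-- List user databases stuck in recovery
SELECT name, state_desc
FROM sys.databases
WHERE state_desc IN ('RECOVERY_PENDING', 'RECOVERING', 'SUSPECT')
AND database_id > 4   -- skip the system databases

-- Generate DROP statements for the user databases that show online
SELECT 'DROP DATABASE [' + name + '];' AS drop_script
FROM sys.databases
WHERE state_desc = 'ONLINE'
AND database_id > 4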

So all I had to do now was use the restore script to generate the RESTORE DATABASE command for all the databases. I swung a cursor around the script to get everything in one go, but more on that in the next blog post.


Thanks,
Tushar Kanti


Thursday, February 20, 2020

Modern Day Data Warehouse



"Data is the new oil" is a phrase often quoted these days whenever we talk in the context of data analytics.
Data analysis is the process of reducing a large volume of collected data into something we can make sense of. The business will not be willing to look at a whole spreadsheet or an entire column of numbers; it is an exhausting and tedious job.
Exploratory data analysis provides techniques that act as an aid in this situation. Most of these techniques work in part by hiding certain aspects of the data while making other aspects clear. This is about surfacing the key performance indicators, which in turn helps the business take the "decision".

The traditional way of doing analytics over a historical data warehouse is going out of fashion and out of context. People want analysis on live data, to understand growth trends and take business decisions. The new way of analyzing data is not as straightforward as it used to be, when everything depended on traditional sources like ERP, CRM and LOB applications. People now want their decisions based on a broader approach that involves the data generated in and around their products, right from social networking to in-house surveys to all connected devices.

To solve this data warehousing puzzle we need some groundbreaking changes. According to a Gartner report, "Data warehousing has reached the most significant tipping point since its inception. The biggest, possibly most elaborate data management system in IT is changing." To achieve this, the traditional data warehouse has to evolve and take advantage of big data in real time. So we need a modern data warehouse for our present puzzle.

In theory, the traditional data warehouse was designed to be a central repository for all the data in an organization. Data from transactional systems like CRM, ERP and LOB was deduplicated, cleansed and loaded (ETL) into a relational schema. Reports were based on batches that ran on top of that data, and analytics was based on those reports.
Traditional Data Warehouse System

The new trends in data analytics, such as high data volume, real-time data, new types of data, and new deployment models in the cloud and hybrid, are putting the traditional warehouse under pressure, and the need for advanced analytics and machine learning is coming into the picture. These trends are forcing enterprises to identify approaches to evolve their existing systems into a modern data warehouse.

High data volume, flowing in from social networks and IoT on top of the existing business data, has caused traditional SMP-based data warehouses to fail catastrophically; they can no longer help the business predict the correct decisions. The basic architecture of this design is incapable of meeting scale-out or scale-in demands, and the vertical scaling approach is not a viable return on investment.
Real-time data is what business analysts now demand, in contrast to the traditional data warehouse where analysis was done on historical, sanitized data. The velocity of captured data is ever increasing, and organizations are using this real-time data not only to build and optimize their business but also to transact and engage in dynamic, event-driven processes like market forecasting.

New sources and types of data have come into the analytics picture: mobile and social media channels, scanners, sensors, RFID tags, devices, feeds, and sources outside the business. These data types are not easily digestible by a traditional data warehouse and do not fit directly into the business schema model, yet they have a lot of potential to help the business optimize its operations and predictions.

New deployment models, in the cloud and hybrid, have hit the industry, and analytics is not untouched by them. The trend among organizations is to invest in big data for analytics, with most of the infrastructure for it chosen in the cloud. The cloud gives the infrastructure cost effectiveness and the scalability to meet the new demands of the organization. It also means that a lot of data is "cloud-born", such as clickstreams, videos, social feeds, GPS, market and weather data.
Various organizations are implementing advanced analytics and predictive analysis to understand what may happen in the near future of their business, from a varied set of data sources and types. The traditional data warehouse was not designed for these new kinds of analytics, which are inductive, or bottom-up, in nature. Unlike the requirements-based model of the traditional warehouse, which works through a defined schema and data collection, advanced analytics and data science use an experimental approach to explore answers to vague, or even not-yet-asked, questions. This requires a lot of experimentation with the data sets before a schema can be created that lets the data make sense to the business.
An organization should be ready to look into the modern data warehouse approach when it sees the phenomena below:
- The traditional data warehouse cannot keep up with the surge in data volume
- Doing better business requires new data sources with real-time data
- Vertical scaling is not helping
- Platform cost is not in coherence with ROI
- Sluggish performance


A modern data warehouse lets you choose from a variety of techniques, from business intelligence to advanced analytics, over varied types of data sets that are exploding in volume, while making the insights real time, with the ability to deliver the correct data at the precise time.

A modern data warehouse is capable of delivering an exhaustive logical data and analytics platform with a complete set of fully supported solutions and technologies that can cater to the demands of the most sophisticated and demanding modern enterprise, whether on-premises, in the cloud, or in any hybrid scenario.
Modern Day Data Warehouse System

Things are changing fast, and we need to adapt ourselves to the new as soon as we know it. Or, as I say it, like now now.

Thanks,
Tushar Kanti

Tuesday, February 25, 2014

Contained Databases is GooooD

Hi Guys,
Let's check out the contained database feature today. Contained database users can be either SQL-based or Windows-based. They need not be mapped to any login, and the part I liked best is that the application user has absolutely no idea about the server it's running on.

Creating a Contained Database

To start with contained databases you will first have to enable the contained database authentication option.

exec sp_configure 'contained database authentication',1
go
reconfigure
go

Now we are ready to create a partially contained database (no clue why they call it partially contained). So let's fire the CREATE DATABASE command.

CREATE DATABASE Contained
CONTAINMENT = PARTIAL

This will create a partially contained database for us.
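
As a quick sanity check, the containment level shows up in sys.databases:

SELECT name, containment_desc
FROM sys.databases
WHERE name = 'Contained'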

Now let's go ahead and create an object (i.e. a table) in this database and put in some values.

USE [Contained]
GO
CREATE TABLE conTab (name char(100))
INSERT INTO conTab VALUES ('tushar'), ('kusum'), ('swati'), ('shashank')
GO

Now let's go ahead and create a Windows user for this database.

USE [Contained]
GO
CREATE USER [eapac\etuskanadm] WITH DEFAULT_SCHEMA=[dbo]
GO
EXEC sp_addrolemember 'db_owner', 'eapac\etuskanadm'
GO

Now this user will be contained in the database itself, as it is not mapped to any login, and it will have no server-level information. The image below shows what the connection to the server/database looks like; while connecting using SSMS you additionally have to go to Options and type the database name there.

[Image: SSMS connection dialog with the database name entered under Options]
You can also connect through the command line; use the -d parameter in the connection to connect straight to the contained database.

sqlcmd -E -Slocalhost -dContained
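
For completeness, a contained user can also be SQL-authentication based, created with a password directly inside the database (hypothetical name and password below), again with no server login behind it:

USE [Contained]
GO
CREATE USER AppUser WITH PASSWORD = 'Str0ng!Passw0rd'   -- hypothetical credentials
GO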

Thanks,
Tushar Kanti