Sunday 18 September 2016

How to start your career as a Database Administrator?



First, Think First…..
During your final academic year, look for internships first; even a small firm is fine. Ask your employer for any kind of database-related work, or work as a SQL Developer. Don't restrict yourself to any one kind of database like MySQL, MSSQL, Oracle or MongoDB. Whatever you get, just go for it.
Now impress your bosses so they convert your internship into a full-time job. Even if the company wants you to be a Windows admin or network admin first, just take the job. Never neglect an opportunity. From there, you can slowly pivot to a DBA career.

Conduct informational interviews with people you know or your friends know. Join communities on Facebook, Quora and OTN. Find a WhatsApp group for DBAs and stay connected with the technology. You will meet new people, and these people could be working for companies that you may be interested in, so you may hear about openings and vacancies. Find out what kind of databases they use. Get specific: find out the database versions and what their daily tasks and challenges are. Try to add value to them. Just drop in every now and then and present your findings.

It may be really small, but once people start seeing that you can add value, you will be hired. Think hard; you will definitely find some people you can talk to.

While learning, always spend more time understanding the fundamentals. Once you get through the basics, the later material will be a piece of cake. So I suggest you learn the basics and try to revise everything daily, for example the Oracle architecture.
The Oracle architecture is very vast; the core components remain the same, but the architecture has evolved with every version. When you become an Oracle DBA, one of your main responsibilities will be monitoring production databases.

During the technical interview, there is a 99.99% probability that the panel will ask you to explain the Oracle architecture. So be well prepared for this: explore every nook and cranny of the Oracle architecture, make notes, and write the architecture out in your own words. Revise it daily.
Once you explain the architecture properly, there is a 60% chance you are selected, because the rest depends on your other technical answers.
Do not explain the architecture in a rush. Never make this mistake. I have seen many candidates finish explaining the architecture in 5-7 minutes, and this is bad.
You should be able to explain the architecture for at least 25-30 minutes, so that the interviewer is impressed.


The next part of the interview preparation is covered in my next article.

General Tips for DBA Interviews

Saturday 17 September 2016

What Should a Junior DBA Do When Facing "Slowness in the Database"?



Many a time, a DBA gets a call from the client side regarding slowness of the database.

In these cases, the fact is that the database itself is never slow or fast; in most cases, the sessions connected to the database slow down when they receive an unexpected hit (an unexpected wait). So to solve this issue you need to find those unexpected hits.

There are a few checks a junior DBA can perform to detect and troubleshoot slowness.

1. Taking user inputs:
·        Is the application slow, or is a particular batch process slow?
·        Is the slowness observed throughout the system, or only for one or a few users?
·        Does it happen at particular times?
By collecting this information we get an outline of what needs to be checked.


2. Check for any lock contention
You can use the query below for this.
SQL> select count(*) from v$lock where block=1;
·        If the count is one or more, there is a blocking lock in the database.
·        Check with the application team and release the blocking sessions (see the sketch below).
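As a quick illustration, here is a minimal sketch (assuming Oracle 10g or later, where V$SESSION exposes the BLOCKING_SESSION column) that lists the blocked sessions and the SID of the session blocking each one:

SQL> -- blocked sessions and who is blocking them
SQL> select sid, serial#, username, event, blocking_session
     from v$session
     where blocking_session is not null;

Once the application team confirms, the blocker can be terminated with ALTER SYSTEM KILL SESSION 'sid,serial#'; but do this only after confirmation, never on your own.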

3. Locking is not the only thing that affects performance; disk I/O contention is another common cause.
·        When a session retrieves data from the database datafiles on disk into the buffer cache, it has to wait until the disk sends the data. The wait event shows up for the session as "db file sequential read" (for an index scan) or "db file scattered read" (for a full table scan). When you see these events, you know that the session is waiting for I/O from the disk to complete.

·        To improve session performance, you have to reduce that waiting period. The exact step depends on the specific situation, but the first technique, reducing the number of blocks retrieved by a SQL statement, almost always works.

·        Reduce the number of blocks retrieved by the SQL statement. Examine the SQL statement to see if it is doing a full-table scan when it should be using an index, if it is using the wrong index, or if it can be rewritten to reduce the amount of data it retrieves. You can confirm these waits with the quick check below.
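To see whether sessions are really spending time on these waits, a small sketch against the standard V$SESSION_WAIT view is:

SQL> -- sessions currently waiting on single-block (index) or multi-block (full scan) reads
SQL> select sid, event, seconds_in_wait
     from v$session_wait
     where event in ('db file sequential read', 'db file scattered read');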


4. Check the resource utilization:
You can check the following Areas :
·        CPU utilization
·        Load
·        Memory utilization

·        Run the top command in Linux to check CPU usage.
·        Check whether any single process is holding the CPU for a long time and note that process ID.
            Press 'c' in top to toggle the full command line, which helps you identify the process that is consuming the most CPU.
·        Run the vmstat, sar and prstat commands to get more information on CPU usage, memory usage and possible bottlenecks.
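Once top has given you the PID of the process consuming the CPU, you can map it back to a database session. A minimal sketch (the &os_pid substitution variable is only a placeholder for the process ID you noted):

SQL> -- find the database session behind an operating-system process id
SQL> select s.sid, s.serial#, s.username, s.program
     from v$session s, v$process p
     where p.addr = s.paddr
     and   p.spid = '&os_pid';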



5. Check the alert log:
Check the alert log, and check how many log switches are happening per hour.
If you have more than 5 archived logs per hour, you may need to increase the redo log size. Troubleshoot any errors that are critical or related to performance.
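Besides reading the alert log, the same information can be pulled from the database itself. A small sketch that counts redo log switches per hour from V$LOG_HISTORY:

SQL> -- redo log switches per hour
SQL> select to_char(first_time, 'YYYY-MM-DD HH24') as hour, count(*) as switches
     from v$log_history
     group by to_char(first_time, 'YYYY-MM-DD HH24')
     order by 1;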

6. Server side checks
Check the memory, paging and I/O utilization from the server side.
Paging and memory can be checked with the 'top' command, and 'iostat' will give the I/O statistics.
Contact the concerned team for any abnormality you see here.

Advanced performance tuning is beyond the scope of this post; it is meant for junior DBAs. Hope it helps.

Expecting your comments...


Source : ArunSankar Blog



Find more about tuning: Click Here



Saturday 10 September 2016

Datapump Architecture: What is the Master Table in Datapump?



Master Table :

The Master Table is created in the schema of the current user running the Data Pump export or import, and it keeps track of a lot of detailed information.

The Master Table is used to track the detailed progress information of a Data Pump job.

This will store the following information :
·         The status of every worker process involved in the operation.
·         The current set of dump files involved.
·         The job’s user-supplied parameters.
·         The current job state and restart information.
·         The current state of every object exported or imported and their locations in the dump file set.

Note :  The Master Table is the key to Data Pump’s restart capability in the event of a planned or unplanned job stoppage.

Behaviour of Master Table :
This table is created at the beginning of a Data Pump operation and is dropped at the end of the successful completion of a Data Pump operation. The Master Table can also be dropped if the job is killed using the kill_job interactive command. If a job is stopped using the stop_job interactive command or if the job is terminated unexpectedly, the Master Table will be retained. 

The keep_master parameter can be set to Y to retain the Master Table at the end of a successful job for debugging purposes.


The name of the Master Table is the same as the Data Pump job name, and it has the following columns:

SQL> DESC <job_name>;
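Because the Master Table is named after the job, you can locate the jobs (and therefore their Master Tables) through the DBA_DATAPUMP_JOBS view. A minimal sketch:

SQL> -- each job's master table lives in the OWNER_NAME schema under the JOB_NAME name
SQL> select owner_name, job_name, operation, job_mode, state
     from dba_datapump_jobs;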



Processes in the Datapump Architecture


Master Control Process:
·         Maintains job state, job description, restart, and dump file set information in the Master Table.
·         This process controls the execution and sequencing of a Data Pump job.
·         The master control process has two main functions:
1.       To divide the loading and unloading of data and metadata tasks and handle the worker processes;
2.       To manage the information in the Master Table and record job activities in the log file.


Worker Process:
·         This handles the requests assigned by the master control process. This process maintains the current status of the job, such as 'pending', 'completed' or 'failed'.
·         The worker process is responsible for loading and unloading data and metadata.
·         The number of worker processes needed can be defined by assigning a number to the parallel parameter.
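To illustrate how the parallel setting translates into worker processes, here is a minimal PL/SQL sketch using the DBMS_DATAPUMP API; the job name DEMO_EXP, the schema SCOTT and the directory DATA_PUMP_DIR are only example values, and the session needs the usual Data Pump privileges:

declare
  h number;
begin
  -- creating the job also creates the master table DEMO_EXP in the caller's schema
  h := dbms_datapump.open(operation => 'EXPORT',
                          job_mode  => 'SCHEMA',
                          job_name  => 'DEMO_EXP');
  dbms_datapump.add_file(handle => h, filename => 'demo%U.dmp', directory => 'DATA_PUMP_DIR');
  dbms_datapump.metadata_filter(handle => h, name => 'SCHEMA_EXPR', value => 'IN (''SCOTT'')');
  -- ask for up to four worker processes for the data and metadata movement
  dbms_datapump.set_parallel(handle => h, degree => 4);
  dbms_datapump.start_job(h);
  dbms_datapump.detach(h);
end;
/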



Parallel Query Process:
·         This process is used when the Data Pump chooses External Table API as the data access method for loading and unloading data. 
·         The worker process that uses the External Table API creates multiple parallel query processes for data movement, with the worker process acting as the query coordinator.


Shadow Process :
·         This process is created when a client logs into the Oracle server. 
·         The shadow process creates a job, which primarily consists of creating the Master Table, creating the queues in Advanced Queues (AQ) used for communication among the various processes, and creating the master control process.
·         Once a job is running, the shadow process’ main job is to check the job status for the client process.  If the client process detaches, the shadow process goes away; however, the remaining Data Pump job processes are still active.
·         Another client process can create a new shadow process and attach to the existing job.
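While a job is running you can actually see these processes as sessions attached to the job. A small sketch against DBA_DATAPUMP_SESSIONS (the SESSION_TYPE column is available in recent Oracle versions):

SQL> -- sessions attached to each Data Pump job and their role (shadow/master/worker)
SQL> select owner_name, job_name, session_type
     from dba_datapump_sessions;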

Wednesday 7 September 2016

What is the biggest lesson you have learned in the corporate world?



Source:  QUORA



  1. Never be the smartest person in the room.
  2. Always listen to your inner voice.  If it "feels wrong" it is.
  3. Debt makes you and your company a slave to someone.
  4. It is never wrong to do the right thing.  It is never right to do the wrong thing.  The ends DO NOT justify the means.   The Ten Commandments are also great safety tips no matter who said them.
  5. Don't hire employees or friends.  Hire people who believe in the vision and want to partner because it makes them feel alive to be a part of something bigger than them or us.
  6. Remember who you wanted to be when you were a kid.  You didn't EVER want to be the dream killer, paper pusher or suit.  You wanted to change the world.  Why aren't you changing the world?  Age is a mind set and the older you are mentally, the less dynamic change you will capture and create.
  7. Love your people. If your end users are viewed as people who are "clicks"  or just customers you will fail.   If you care about them, you will make the product that will actually make their life better or easier.  You both win. Most companies are upside down.  Prioritize:
    A) principles
    B) people (love and respect your end users/vendors and employees/partners)
    -- If you have those as your solid foundation and you protect those two things, the next two fall into place naturally.
    C) product. If you know the first two the product becomes obvious
    D) profit.  It is a byproduct.  Never forget that. Making profit your goal is the easiest way to become all that you despise.
  8. Make your word your bond.  Contracts are important but your word is more so.  Let your yes mean yes and your no mean no, always.  The spirit of the deal is just as important as the words of the deal.  Never dishonor yourself.
  9. Plan on being mocked or told it will never work.  Sometimes they are right, but the best ideas do seem crazy at first to most or everyone would already be doing it.
  10. Know not just what you do or make, but why.  That is the difference between a guy who makes computers and Steve Jobs.  He knew why.  ART.
  11. With God all things are possible.
  12. There is no such thing as compartmentalizing ethics.  You can not cheat on your wife or taxes but be honest and trustworthy at the office.   Fix your character flaws.  We all have them.   Master them or they will master you.
  13. Make your weakness your strength.

SOURCE: https://www.quora.com/What-is-the-biggest-lesson-you-have-learned-in-the-corporate-world