Chris Foot

Remote DBA Experts Blog

Taking a proactive approach to database security

Mon, 2014-08-25 01:09

The three V's of big data – volume, variety and velocity – are causing vexing issues for database security teams. 

Administrators are responsible for carrying out a number of essential tasks, such as:

  • Improving the performance of Hadoop, MySQL, NoSQL and other complex architectures
  • Allowing data analysts to access these environments to scrutinize information 
  • Assigning access permission and privileges 
  • Monitoring interactions and instances in real time (see the monitoring sketch after this list) 
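
As a hedged, minimal example of the last item: in SQL Server, the dynamic management view sys.dm_exec_sessions surfaces who is connected and what they are doing in real time.

-- Minimal sketch: list active user sessions and their most recent activity
SELECT s.session_id, s.login_name, s.host_name, s.program_name,
       s.status, s.last_request_start_time
FROM sys.dm_exec_sessions AS s
WHERE s.is_user_process = 1
ORDER BY s.last_request_start_time DESC;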

The era of big data requires professionals to carry out these responsibilities proactively, before hazards materialize. Conducting surveillance is arguably the most essential task DBAs need to perform.

Overwhelmed

eSecurity Planet contributor Jeff Goldman noted that many security breaches stem from a lack of available personnel. Employing tools that actively seek inconsistencies and infiltration attempts is a strong approach, but if a company doesn't have the professionals on hand to assess such issues, information could be lost or stolen. 

Goldman noted Target's experts received numerous warnings before the headline-making breach occurred, but they didn't have the resources or time required to mitigate the problem. The sheer amount of data they were required to manage hindered them from acting on a critical alert. 

Relieving stress 

It's this kind of situation that prompts businesses to outsource to companies that provide database active monitoring. As opposed to leaving a team of two dozen administrators to struggle with information management, enterprises can benefit from the resources of an entire organization. 

Remote database support groups dedicate not only software, but entire teams of professionals that proactively search for problems residing in complex architectures. Such experts have the resources to take the following measures:

  • Program and develop customized surveillance tools designed to cater to specific customer needs
  • Consult a comprehensive library of complex problems and solutions that can be applied to resolve them
  • Implement surveillance programs that abide by standards defined by HIPAA, the Sarbanes-Oxley Act and Payment Card Industry regulations
  • Network internally to resolve intricate issues as a team

Collecting external data
The Guardian referenced a study by PricewaterhouseCoopers, which surveyed 10,000 employees to figure out how businesses were collecting social data. The research discovered younger people were more than happy to share such information, meaning enterprises are going to be aggregating more unstructured information, such as:

  • Comments on Facebook
  • Photo shares on Instagram
  • Videos through YouTube

Enterprises will likely want to process this intelligence in real time, putting pressure on in-house teams to secure communications and effectively organize data in environments such as Hadoop and NoSQL. 

If businesses want to benefit from scrutinizing social information, they need the attention outsourced DBAs can provide. For more information on database active monitoring, read this white paper. 

The post Taking a proactive approach to database security appeared first on Remote DBA Experts.

SQL Server techniques that mitigate data loss

Fri, 2014-08-22 10:58

Natural disasters, cybercriminals and costly internal mistakes can cause companies to lose critical information. If the appropriate business continuity strategies aren't employed, organizations could lose customers or face government penalties. 

These concerns have motivated database administrators to use programs with strong recovery tools, such as replication, mirroring and failover clustering.

SQL Server Pro contributor and DH2i Chief Technology Officer Thanh Ngo outlined a number of SQL Server functions that improve DR and ensure applications are highly available. Some of the most useful features he named are detailed below.

AlwaysOn Availability Groups

This particular feature allows a set of user databases to fail over as a cohesive unit. It forms a primary replica of the databases, which is available for read-write tasks. From there, one to eight secondary replicas can be created; these can serve read-only workloads but must be configured to do so (a minimal creation sketch follows). 
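
As a hedged, minimal sketch (the group, instance, endpoint and database names are hypothetical), creating such a group in T-SQL looks roughly like this:

-- Hypothetical names throughout; assumes a WSFC cluster and mirroring
-- endpoints already exist on both instances.
CREATE AVAILABILITY GROUP AG1
FOR DATABASE SalesDB
REPLICA ON
    N'SQLNODE1' WITH (
        ENDPOINT_URL = N'TCP://sqlnode1.example.com:5022',
        AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
        FAILOVER_MODE = AUTOMATIC),
    N'SQLNODE2' WITH (
        ENDPOINT_URL = N'TCP://sqlnode2.example.com:5022',
        AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
        FAILOVER_MODE = AUTOMATIC,
        SECONDARY_ROLE (ALLOW_CONNECTIONS = READ_ONLY));  -- read-only secondary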

Availability Groups builds on the concept of database mirroring, which enhances a database's security and accessibility. The feature essentially copies transaction log records from principal servers to their mirrored counterparts. Ngo outlined how mirroring functions in one of the following modes:

  • High safety mode with automatic failover: Transactions are synchronized between the principal and mirror servers while a "witness" server orchestrates automatic failover. 
  • High safety mode without automatic failover: The same operation detailed above is executed, but without the presence of a witness server.
  • High performance mode: The principal database commits a transaction without waiting for the mirrored counterpart to write the log to disk (see the mode-setting sketch after this list).
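
As a hedged illustration (the database name and witness endpoint are hypothetical), these modes map onto mirroring's T-SQL safety settings:

-- Run on the principal; assumes a mirroring session is already established.
ALTER DATABASE SalesDB SET PARTNER SAFETY FULL;  -- high safety (synchronous)
ALTER DATABASE SalesDB SET PARTNER SAFETY OFF;   -- high performance (asynchronous)
-- Adding a witness enables automatic failover in high safety mode:
ALTER DATABASE SalesDB SET WITNESS = N'TCP://witness.example.com:5022';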

Replication 
Ngo also acknowledged replication, which consists of a primary server (also known as the publisher) allocating data to one or more secondary databases (known as subscribers). Replication can be executed in one of three ways (a publisher-side sketch follows this list): 

  • Transactional allows for real-time data availability because it enables the publisher to distribute information to subscribers immediately or at regular intervals. 
  • Snapshot copies and provisions data in the primary server and sends the cloned data to the secondary database once it's created.
  • Merge enables bi-directional replication, meaning all changes made in both the subscriber and publisher databases are synchronized automatically. 
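
As a rough, hedged sketch of the transactional flavor (the database, publication and table names are hypothetical, and a distributor is assumed to be configured), the publisher side is typically set up with the replication stored procedures:

-- Hypothetical names; run on the publisher after the distributor is set up.
EXEC sp_replicationdboption
    @dbname = N'SalesDB', @optname = N'publish', @value = N'true';

EXEC sp_addpublication
    @publication = N'SalesPub',
    @repl_freq = N'continuous',   -- stream changes as they occur
    @status = N'active';

EXEC sp_addarticle
    @publication = N'SalesPub',
    @article = N'Orders',
    @source_object = N'Orders';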

A contemporary need 
With these techniques in mind, if a company chooses to outsource its database administration needs, it should detail which government standards it needs to abide by. From there, DBAs can carry out a thorough risk assessment of how much customer data is vulnerable – a task MSPmentor contributor CJ Arlotta described as critical. 

Employing SQL Server's replication and availability group features helps ensure data is retained even if a database breach or failure occurs. 

The post SQL Server techniques that mitigate data loss appeared first on Remote DBA Experts.

Identifying Deadlocks Using the SQL Server Error Log

Thu, 2014-08-21 09:30

Deadlocking in SQL Server can be one of the more time-consuming issues to resolve. The script below can reduce the time it takes to gather the necessary information and troubleshoot the cause of the deadlocks. Using this script requires your SQL Server version to be 2005 or newer and Trace Flag 1222 to be enabled so the deadlocking information is captured in the error log (a sketch for enabling the flag follows).
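
As a minimal sketch, the trace flag can be turned on globally at runtime (it can also be set with the -T1222 startup parameter):

-- Write deadlock detail to the error log for all sessions
DBCC TRACEON (1222, -1);
-- Confirm the flag is active
DBCC TRACESTATUS (1222);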

The first portion of the script collects the data written to the error log and parses it for the information needed. With this data, the script can return many different data points for identifying the root cause of your deadlocks. It begins with a query to return the number of deadlocks in the current error log.

select
distinct top 1 deadlockcount
from @results
order by deadlockcount desc

The next script will allow you to review all of the deadlock information in the current error log. It will output the raw InputBuffer details, but if the queries running in your environment have extraneous tabs or spaces, you can modify the commented portion to remove them.

select 
deadlockcount, logdate, processinfo, 
logtext
--,rtrim(ltrim(replace(replace(replace(replace(replace(replace(replace(replace(logtext,'               ',' '),'       ',' '),'     ',' '),'   	',' '),'    ',' '),'  ',' '),'  ',' '),'	',' '))) as logtext_cleaned
from @results
order by id

An important piece of information when identifying and resolving deadlocks is the resource locks. This next query returns all of the error log records containing details for the locks associated with deadlocks. In some situations, the object and/or index name may not be included in this output.

select distinct
logtext
from @results 
where logtext like '%associatedobjectid%'

In order to find the objects involved with the deadlock occurrences, run the next query with its output set to text (Results to Text in SSMS). Then, copy the output into a new query window and remove the 'union' from the end. When run, it will return the object and index names.

select distinct
'SELECT OBJECT_NAME(i.object_id) as objectname, i.name as indexname
      FROM sys.partitions AS p
      INNER JOIN sys.indexes AS i ON i.object_id = p.object_id AND i.index_id = p.index_id
      WHERE p.partition_id = '+convert(varchar(250),REVERSE(SUBSTRING(REVERSE(logtext),0,CHARINDEX('=', REVERSE(logtext)))))+'
	  union
	  '
from @results 
where logtext like '   keylock hobtid=%'
union
select distinct
'SELECT OBJECT_NAME(i.object_id) as objectname, i.name as indexname
      FROM sys.partitions AS p
      INNER JOIN sys.indexes AS i ON i.object_id = p.object_id AND i.index_id = p.index_id
      WHERE p.partition_id = '+convert(varchar(250),REVERSE(SUBSTRING(REVERSE(logtext),0,CHARINDEX('=', REVERSE(logtext)))))+'
	  union
	  '
from @results
where logtext like '   pagelock fileid=%'

In my experience, situations can arise where there are a large number of deadlocks but only a few queries involved. This portion of the script will return the distinct queries participating in the deadlocks. The commented lines can be modified to remove extra tabs and spaces. To avoid issues caused by the InputBuffer data being on multiple lines, you should cross-reference these results with the results of the next query.

select
max(deadlockcount) as deadlockcount, max(id) as id, 
logtext
--rtrim(ltrim(replace(replace(replace(replace(replace(replace(replace(replace(logtext,'               ',' '),'       ',' '),'     ',' '),'   	',' '),'    ',' '),'  ',' '),'  ',' '),'	',' '))) as logtext_cleaned
from @results
where logtext not in (
'deadlock-list',
'  process-list',
'    inputbuf',
'    executionStack',
'  resource-list',
'    owner-list',
'    waiter-list'
)
and logtext not like '     owner id=%'
and logtext not like '     waiter id=%'
and logtext not like '   keylock hobtid=%'
and logtext not like '   pagelock fileid%'
and logtext not like ' deadlock victim=%'
and logtext not like '   process id=%'
and logtext not like '     frame procname%'
group by 
logtext
--rtrim(ltrim(replace(replace(replace(replace(replace(replace(replace(replace(logtext,'               ',' '),'       ',' '),'     ',' '),'   	',' '),'    ',' '),'  ',' '),'  ',' '),'	',' ')))
order by id asc, deadlockcount asc

This query will return the execution stack and InputBuffer details for each deadlock.

select 
deadlockcount, logdate, processinfo, logtext
--rtrim(ltrim(replace(replace(replace(replace(replace(replace(replace(replace(logtext,'               ',' '),'       ',' '),'     ',' '),'   	',' '),'    ',' '),'  ',' '),'  ',' '),'	',' '))) as logtext_cleaned
from @executionstack 
WHERE logtext not like '%process id=%'
and logtext not like '%executionstack%'
order by id asc

For documentation purposes, this query will return the distinct InputBuffer output for the deadlock victims. If the InputBuffer data is on multiple lines, you should cross-reference these results with the results of the next query.

select max(d.deadlockcount) as deadlockcount, max(d.executioncount) executioncount, max(d.id) as id, logtext
--rtrim(ltrim(replace(replace(replace(replace(replace(replace(replace(replace(d.logtext,'               ',' '),'       ',' '),'     ',' '),'   	',' '),'    ',' '),'  ',' '),'  ',' '),'	',' '))) as logtext_cleaned
from @executionstack d
right join (
	select e.executioncount
	from @results r
	join (
		select deadlockcount, logtext, convert(varchar(250),REVERSE(SUBSTRING(REVERSE(logtext),0,CHARINDEX('=', REVERSE(logtext))))) victim
		from @results
		where logtext like ' deadlock victim=%'
	) v on r.deadlockcount=v.deadlockcount
	left join (
		select id, logtext, substring(logtext, charindex('=', logtext)+1,50) processidstart,
		substring(substring(logtext, charindex('=', logtext)+1,50),0, charindex(' ', substring(logtext, charindex('=', logtext)+1,50))) processid
		from @results
		where logtext like '   process id=%'
	) p on r.id=p.id
	join @executionstack e on r.id=e.id
	where v.victim=p.processid
) q on d.executioncount=q.executioncount
where d.logtext not like '   process id=%'
and d.logtext <> '    executionStack'
and d.logtext not like '     frame%'
group by logtext
--rtrim(ltrim(replace(replace(replace(replace(replace(replace(replace(replace(logtext,'               ',' '),'       ',' '),'     ',' '),'   	',' '),'    ',' '),'  ',' '),'  ',' '),'	',' ')))
order by id asc, deadlockcount asc, executioncount asc

This query will return the execution stack and InputBuffer details for each victim.

select d.deadlockcount, d.logdate, d.processinfo, logtext
--rtrim(ltrim(replace(replace(replace(replace(replace(replace(replace(replace(d.logtext,'               ',' '),'       ',' '),'     ',' '),'   	',' '),'    ',' '),'  ',' '),'  ',' '),'	',' '))) as logtext_cleaned
from @executionstack d
right join (
	select e.executioncount
	from @results r
	join (
		select deadlockcount, logtext, convert(varchar(250),REVERSE(SUBSTRING(REVERSE(logtext),0,CHARINDEX('=', REVERSE(logtext))))) victim
		from @results
		where logtext like ' deadlock victim=%'
	) v on r.deadlockcount=v.deadlockcount
	left join (
		select id, logtext, substring(logtext, charindex('=', logtext)+1,50) processidstart,
		substring(substring(logtext, charindex('=', logtext)+1,50),0, charindex(' ', substring(logtext, charindex('=', logtext)+1,50))) processid
		from @results
		where logtext like '   process id=%'
	) p on r.id=p.id
	join @executionstack e on r.id=e.id
	where v.victim=p.processid
	--order by r.id
) q on d.executioncount=q.executioncount
where d.logtext not like '   process id=%'
and d.logtext <> '    executionStack'
order by d.id asc

The script, which can be downloaded here, includes all of these queries for you to use. Each one is independent, so if you are only interested in the results for a single query, the other sections can be commented out.

Any feedback you have is always appreciated. In my opinion, that is one of the best parts about writing T-SQL! Don’t forget to check back for my next post, in which I will be using the AdventureWorks2008R2 database to provide an in-depth deadlock analysis.

The post Identifying Deadlocks Using the SQL Server Error Log appeared first on Remote DBA Experts.

Data growth inciting need for cloud databases

Mon, 2014-08-18 01:23

To further reduce storage costs, organizations are storing their information in public cloud databases. 

Consistent development in cloud technology has made accessing data across a network easier than computer scientists of 20 years ago could have predicted. Due to this popularity, database administration services have trained their staffs to issue SQL Server queries across Microsoft Azure and other cloud environments. 

Big data, services models evolving
TechTarget contributor John Moore noted that Database-as-a-Service (DBaaS) is becoming less about just providing storage and more about managing, optimizing and conducting performance diagnostics. Simply funneling data into a remote platform often causes disorganization – making it more difficult to find pertinent information and analyze it. 

Moore referenced a statistic produced by MarketsandMarkets, which predicts the cloud database and DBaaS market will grow at a compound annual growth rate of 67.3 percent over the next five years, reaching $14.05 billion by 2019. Outsourcing maintenance and support for cloud data stores reduces overhead and ensures database security remains intact. 

What knowledge is needed? 
In regard to hiring a separate company to manage cloud servers, it's important to acknowledge the types of information organizations are aiming to learn from. Most of the data is unstructured and is typically stored in Hadoop environments and NoSQL databases. 

Therefore, remote DBAs who are knowledgeable about both these technologies and about conducting administration via the cloud are essential. That being said, enterprises shouldn't ignore those with extensive knowledge of traditional programs such as SQL Server. 

The advantages of Azure and SQL Server
Because these two programs are both produced by Microsoft, natural compatibility between them is expected. Network World noted that putting SQL data in Azure can save enterprises anywhere from $20,000 to $50,000 in procuring physical data center equipment (servers, bandwidth, storage, etc.).

In order to ensure security, administrators simply need to configure the SQL environment properly. The source acknowledged that the following protective functions can be applied to cloud-hosted SQL databases:

  • Azure Security provides users with a "Trust" guide, in which Microsoft details how Azure complies with HIPAA, ISAE and several other data security laws.
  • Transparent Data Encryption enables DBAs to encrypt the contents of an entire database, with a key only those who initiated the encryption task can use (a setup sketch follows this list). 
  • Automatic protection means Azure databases are private by default; users must explicitly configure the environment before the public or other unauthorized parties can view the information. 
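
As a hedged, minimal sketch of enabling TDE on a SQL Server database (all names and the password are hypothetical):

-- Minimal TDE setup sketch; hypothetical names throughout.
USE master;
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Str0ngP@ssw0rd!';
CREATE CERTIFICATE TdeCert WITH SUBJECT = 'TDE certificate';
USE SalesDB;
CREATE DATABASE ENCRYPTION KEY
    WITH ALGORITHM = AES_256
    ENCRYPTION BY SERVER CERTIFICATE TdeCert;
ALTER DATABASE SalesDB SET ENCRYPTION ON;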

Aside from these amenities, employing active database monitoring is the best way for organizations to keep cloud databases protected from malicious figures. 

The post Data growth inciting need for cloud databases appeared first on Remote DBA Experts.

How RDX’s BI services make a difference: Additional Services Series pt. 3 [VIDEO]

Thu, 2014-08-14 12:59

Transcript

At RDX, we provide a full suite of BI services that includes data integration (SSIS), analysis and mining of data (SSAS), and scheduled and manual reporting of data in a variety of formats for visual representation (SSRS).

Our SSIS services include extracting, transforming and loading data from any source into a common format that you can easily understand and use to make better business decisions.

We support high volumes of data and automated workflows; we also provide many kinds of automatic transformations, as well as custom coding in C# and VB.NET.

Our SSAS services allow you to choose between a multi-dimensional (cube) or tabular OLAP – online analytical processing – model to break down the data we've gathered and transition it into your browser of choice for easy, actionable reporting.

Our SSRS services come in an array of drill-down and drill-through reports, graphs, charts, and diagrams, so you can make the most of your data, including accessing previously stored reports.

For more details, download our BI whitepaper. We'll see you next time!

 

The post How RDX’s BI services make a difference: Additional Services Series pt. 3 [VIDEO] appeared first on Remote DBA Experts.

Can you handle big data? Oracle may have an answer

Wed, 2014-08-13 01:33

Now more than ever, database administration services are providing their clients with the expertise and software required to support big data endeavors. 

They haven't necessarily had much of a choice. Businesses need environments such as Hadoop to store the large amounts of unstructured data they collect and analyze to gain insights regarding customer sentiment, procurement efficiencies and a wealth of other factors. 

Oracle's assistance 
According to PCWorld, Oracle recently released a software tool capable of querying Hadoop and NoSQL (Not Only SQL) environments. The solution is an add-on for the company's Big Data Appliance, a data center rack composed of its Sun x86 servers programmed to run Cloudera's Hadoop distribution.

In order for businesses to benefit from the simplicity of Big Data SQL, the source noted they must have a 12c Oracle database installed on the company's Exadata database machine. This allows Exadata and the x86 Big Data Appliance configuration to share an interconnect for data exchange. 

Assessing a "wider problem"

Oracle Vice President of Product Development Neil Mendelson asserted the solution wasn't created for the purpose of replacing existing SQL-on-Hadoop engines such as Hive and Impala. Instead, Mendelson maintained that Big Data SQL enables remote DBA experts to query a variety of information stores while moving a minimal amount of data. 

This means organizations don't have to spend the time or network resources required to move large troves of data from one environment to another, because Smart Scan technology is applied to conduct filtering on a local level.

InformationWeek contributor Doug Henschen described Smart Scan as a function that combs through data on the storage tier and identifies what information is applicable to the submitted query. Oracle Product Manager Dan McClary outlined an example of how it could be used (a hedged sketch of the pattern follows the list):

  • A data scientist wants to compare and contrast Twitter data in Hadoop with customer payment information in Oracle Database
  • Smart Scan filters out tweets that don't have translatable comments and eliminates posts without latitude and longitude data
  • Oracle Database then receives one percent of the total Twitter information in Hadoop
  • A visualization tool identifies location-based profitability based on customer sentiment
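
As a hedged illustration only (the table, column and directory names are hypothetical), Big Data SQL exposes Hadoop-resident data as Oracle external tables, which can then be joined with relational tables in ordinary SQL:

-- Hypothetical sketch: map a Hive table of tweets into Oracle as an
-- external table, then join it with relational payment data.
CREATE TABLE tweets (
    cust_id    NUMBER,
    body       VARCHAR2(4000),
    latitude   NUMBER,
    longitude  NUMBER
)
ORGANIZATION EXTERNAL (
    TYPE ORACLE_HIVE            -- Big Data SQL access driver for Hive
    DEFAULT DIRECTORY DEFAULT_DIR
);

SELECT t.latitude, t.longitude, SUM(p.amount) AS revenue
FROM tweets t
JOIN payments p ON p.cust_id = t.cust_id
WHERE t.latitude IS NOT NULL AND t.longitude IS NOT NULL
GROUP BY t.latitude, t.longitude;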

Reducing risk 
In addition, Oracle allows DBA services to leverage authorizations and protocols to ensure security is maintained when Hadoop or NoSQL is accessed. For instance, when a professional is assigned the role of "analyst," he or she has permission to query the big data architectures, while those who lack the role cannot (an illustrative grant sketch follows). 
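
Illustratively (the role, user and table names are hypothetical), the pattern resembles standard SQL role grants:

-- Hypothetical names; only members of the analyst role may query the data.
CREATE ROLE analyst;
GRANT SELECT ON sales.customer_payments TO analyst;
GRANT analyst TO jsmith;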

The post Can you handle big data? Oracle may have an answer appeared first on Remote DBA Experts.

Websites: What to look for in a database security contract

Mon, 2014-08-11 10:28

When shopping for a world-class database administration service, paying attention to what specialists can offer in the way of protection is incredibly important. 

For websites storing thousands or even millions of customer logins, constantly monitoring server activity is essential. A recent data breach showed just how vulnerable e-commerce companies, Software-as-a-Service providers and a plethora of other online organizations are. 

A staggering number 
A Russian criminal organization known as "CyberVor" recently collected 1.2 billion unique username and password combinations and 500 million email addresses from websites employing lackluster protection techniques, Infosecurity Magazine reported.

Andrey Dulkin, senior director of cyber innovation at CyberArk, noted the attack was orchestrated by a botnet – a collection of compromised machines working to achieve the same end goal. CyberVor carefully employed multiple infiltration techniques simultaneously in order to harvest login data. 

Where do DBAs come into play? 
Database active monitoring is essential to protect the information websites hold for their subscribers and patrons. Employing anti-malware is one thing, but being able to perceive actions occurring in real time is the only way organizations can hope to deter infiltration attempts at their onset. 

Although TechTarget was referring to disaster recovery, the same principles of surveillance apply to protecting databases. When website owners look at the service-level agreement, the database support company should provide the following accommodations:

  • Real-time reporting of all server entries, detailing which users entered an environment, how they're interacting with it and what programs they're using to navigate it (a hedged auditing sketch follows this list). 
  • Frequent testing that searches for firewall vulnerabilities, unauthorized programs, suspicious SQL commands, etc. 
  • On-call administrators capable of addressing any questions or concerns a website may have.
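
As one hedged example of the first item (the file path and audit names are hypothetical), SQL Server's built-in auditing can record exactly who entered an environment:

-- Hypothetical names; captures successful and failed logins to a file.
CREATE SERVER AUDIT LoginAudit
TO FILE (FILEPATH = 'D:\Audits\');

CREATE SERVER AUDIT SPECIFICATION LoginAuditSpec
FOR SERVER AUDIT LoginAudit
ADD (SUCCESSFUL_LOGIN_GROUP),
ADD (FAILED_LOGIN_GROUP)
WITH (STATE = ON);

ALTER SERVER AUDIT LoginAudit WITH (STATE = ON);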

Applying basics, then language 
Although advanced analytics and tracking cookies can be applied to actively search for and eliminate viruses – much as white blood cells attack pathogens – neglecting to cover standard security practices obviously isn't optimal. 

South Florida Business Journal acknowledged one of the techniques CyberVor used was a vulnerability IT professionals have been cognizant of for the past decade – SQL injection. This tactic likely involved the criminals injecting a malicious query that ordered the SQL database to unveil all of its usernames and passwords (a hedged parameterization sketch follows). 
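
The standard defense is to parameterize user input rather than concatenating it into query text. As a minimal sketch (the table, column and parameter names are hypothetical):

-- Hypothetical names; the input is bound as a parameter, never executed as SQL.
DECLARE @user NVARCHAR(128) = N'value-from-web-form';

EXEC sp_executesql
    N'SELECT username FROM dbo.UserLogins WHERE username = @u',
    N'@u NVARCHAR(128)',
    @u = @user;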

SQL Server, Microsoft's signature database solution, is quite popular among many websites, so those using this program need to contract DBA organizations with extensive knowledge of the language and best practices. 

Finally, remote DBA services must be capable of encrypting login information, as well as the data passwords are protecting. This provides an extra layer of protection in case a cybercriminal manages to unmask a username-password combination (a hedged hashing sketch follows). 
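
One common way to protect stored credentials, sketched here under hypothetical table and column names, is to store a salted hash instead of the password itself:

-- Hypothetical names; stores a per-user random salt and a SHA-256 hash.
DECLARE @salt VARBINARY(16) = CRYPT_GEN_RANDOM(16);
DECLARE @password NVARCHAR(128) = N'example-password';

INSERT INTO dbo.UserLogins (username, salt, password_hash)
VALUES (N'jdoe', @salt,
        HASHBYTES('SHA2_256', CAST(@password AS VARBINARY(256)) + @salt));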

The post Websites: What to look for in a database security contract appeared first on Remote DBA Experts.