
Chris Foot

Remote DBA Experts Blog

Mozilla Working to Enhance its Security Process [VIDEO]

Tue, 2014-09-16 12:44

Transcript

Welcome back to RDX. A proper test environment should be a regular part of your business' Change Management Process. However, if Personally Identifiable Information (PII) is not removed from the test data, sensitive information could be exposed.

According to eWEEK, Mozilla accidentally exposed critical information in two separate incidents. The most recent, first reported August 27, left 97,000 developers' information exposed for approximately three months. The landfill.bugzilla.org development system exposed information including email addresses and encrypted passwords. Initial disclosure is thought to have occurred during a database migration, when a database dump included user data. Users of this system have been advised to change their passwords.

Mozilla is now revising its test plan to exclude database dumps. An additional step businesses can take to protect their PII is to require two-factor authentication for access.
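Scrubbing PII before test data leaves production is the simplest guard against this class of exposure. Below is a minimal sketch of such a masking step; the table and column names are hypothetical, not Mozilla's schema.

-- Mask PII in a test copy before it is dumped or shared.
-- Table and column names are illustrative only.
UPDATE test_db.dbo.users
SET    email         = 'user' + CAST(user_id AS varchar(10)) + '@example.invalid',
       password_hash = 'REDACTED';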

Thanks for watching! 


Industrial-grade cyberattacks plague utilities

Mon, 2014-09-15 08:57

While the retail sector has received much attention lately due to the prevalence of cyberattacks, utilities are also in grave danger of sustaining Web-based aggression.

Database administration services are acknowledging the persistence of Dragonfly, an organization of cybercriminals that focuses on infiltrating the industrial sector. Securing Web-based assets, as well as on-premise architectures, is imperative, as these carefully orchestrated hacks could leave denizens without power indefinitely.

A grievous situation
Warwick Ashford, a contributor to Computer Weekly, noted an instance in which he and his team participated in the Kaspersky Industrial Protection Simulation role-playing game. The scenario presented a situation in which a regional water purification plant was infiltrated, leaving thousands of people without drinking water. In addition, the company sustained incredible losses within a five-week period. 

The culprit? Dragonfly, a co-op based out of Eastern Europe that security firm Symantec has blamed for hacking more than 1,000 North American and European energy enterprises. Dragonfly specializes in disabling industrial control systems, which manage electrical, water, gas and oil systems. 

Why it's so dangerous
Ashford noted the water treatment company intended to train its IT staff in cybersecurity protocols, but apparently never got around to doing so. After a number of network audits were performed, the company discovered two unidentified Internet connections to the industrial control system. The IT department responded by setting up a number of firewalls. 

However, after the security solutions were implemented, Ashford's team still received notifications of unhealthy chlorine levels in the firm's drinking water. Through phishing scams, the perpetrators were able to manipulate the industrial control system's chlorine output and filtration. 

While the aforementioned situation was simply a simulation, that doesn't change the fact that Dragonfly is capable of inflicting such damage. Imagine the harm such a group could cause in a real-world scenario.

Real-world situations 
FierceSmartGrid noted one instance in 2011 in which Dragonfly initiated "Energetic Bear," an attack campaign that leveraged two remote access Trojans. These tools form a bridge between cybercriminals and targeted architectures; they allowed Dragonfly to insert malware into a U.S. utility's databases. 

The source noted a statement by Symantec, which maintained Dragonfly's initiatives are focused on "extracting and uploading stolen data, installing further malware onto systems and running executable files on infected computers."

It's this particular breed of cybercriminal that has instigated the need for database active monitoring. Maintaining a high level of thorough surveillance at all times is a must, especially for utilities distributing essential resources to consumers.


Database manufacturers include JSON in latest provisions

Tue, 2014-09-09 10:13

JavaScript Object Notation (JSON) has been lauded as one of the easiest data-interchange formats to understand, and has been a boon to professionals managing Web-based data. 

Database administration services and Web developers alike favor the format when handling complex information, because it's easy for people to read and write, JSON.org noted. Programmers also appreciate that it uses conventions familiar from C, C++, Java, JavaScript, Python and other languages. JSON is constructed on two foundations:

  • A collection of name/value pairs, which is known in other languages as an object
  • An ordered list of values, also called an array

Why add JSON support to databases? 
Unstructured data, a type of information that is ubiquitous in the current Digital Age, is often stored in documents, which is exactly how JSON organizes data. Many NoSQL systems, such as MongoDB and Couchbase, and tools in the Hadoop ecosystem follow this model, which has made it a favorite among Web developers, InfoWorld noted. 

In order to compete with such architectures, software giant Oracle added JSON support to the company's Oracle 12c databases, a move outlined at the NoSQL Now conference in San Jose, California, last month. This is a break from the conventional relational database management system architecture, and it's presented as an alternative to PostgreSQL, which has been regarded as the open source alternative to Oracle. 
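As a hedged sketch of what Oracle 12c's native JSON support looks like in practice, consider the example below; the table and document are hypothetical, but IS JSON, JSON_VALUE and JSON_EXISTS are the 12c features in question. The sample document also shows both JSON foundations listed above: an object of name/value pairs containing an array.

-- Store JSON in an ordinary column constrained to be valid JSON,
-- then query it relationally (table and data are illustrative only).
CREATE TABLE orders (
    id  NUMBER PRIMARY KEY,
    doc VARCHAR2(4000) CONSTRAINT orders_doc_is_json CHECK (doc IS JSON)
);

INSERT INTO orders VALUES (1,
    '{"customer": "Acme", "items": [{"sku": "X1", "qty": 2}]}');

SELECT JSON_VALUE(doc, '$.customer') AS customer
FROM   orders
WHERE  JSON_EXISTS(doc, '$.items');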

Is it a valid option? 
Still, DBA services may advise their clients to keep using Oracle 12c for tabular data and conventional NoSQL solutions for semi-structured information. InfoWorld noted that the latter abide by a "scale out" approach as opposed to a "scale up" one. 

Scaling out enables NoSQL solutions to leverage commodity servers to enhance performance, as opposed to bulking up a massive database server. In addition, the way a document-based database allocates information makes deployments highly resistant to failure, because the data is distributed across multiple servers. 

When will the day come? 
InfoWorld classified modern databases into three types: 

  • RDBMS, which handle structured data
  • NoSQL, which manage semi-structured information
  • Hadoop, which organizes unstructured data

The source proposed an interesting scenario: that all three systems be synchronized into a single solution. JSON could potentially provide a structure for just such a database, but it's unknown whether Oracle, IBM or another tech company would be able to successfully develop it (the profits for said enterprise would be huge).

Yet, it's more likely those in the open source community would manufacture a database capable of seamlessly handling structured, semi-structured and unstructured data. Just look at how monumental Hadoop has been. 


Getting the Whole DB2 package, Additional Services Series Pt. 8 [VIDEO]

Mon, 2014-09-08 14:09

Transcript

Need to give your databases a boost?

Hi, welcome back to RDX! If your organization's handling large, data-intensive workloads, IBM's DB2 for Linux, Unix and Windows is an attractive alternative.

RDX has worked with DB2 since the beginning, and our DB2 solutions are architected to provide a superior level of DB2 database support. From day-to-day operations to strategic decision making, our DB2 solutions arm customers with the experience, skillsets and best practices required to maximize their critical DB2 environments.

RDX also provides support for IBM’s IMS product set which offers the availability, protection, performance and scalability companies need to process online transactions.

Thanks for watching, and be sure to refer to our company resume for more information on our DB2 and IMS services!
 


How REST marries unstructured and structured data

Fri, 2014-09-05 10:19

In regard to big data, the majority of the information produced by enterprises and consumers alike is unstructured, meaning architectures such as MongoDB and Hadoop must be utilized.

This has created a schism between structured and unstructured data. Combining the two requires database administration professionals to query relational and non-relational architectures separately before analysis tools can consider the information together.

Using REST APIs
Representational state transfer application programming interfaces offer somewhat of a solution to this problem. For those who are unfamiliar with the technology, REST and APIs are two distinct concepts. According to TechTarget, the former operates over HTTP to read specific websites containing Extensible Markup Language (XML) files. In turn, APIs provide a set of protocols and tools for building software applications.

Therefore, it can be deduced that REST APIs are utilized to develop programs capable of reading information on Web pages, but how does this impact a DBA’s ability to combine and process both unstructured and structured data?

First, it’s important to scrutinize what REST is typically used for. TechTarget noted the architecture is used to provide functionality to social networking channels, mobile applications and business process automation. On the surface, it would appear skillful database experts could program REST APIs to segregate information the same way Hadoop does and organize it into relational databases, but given the availability of the open source project, doing so isn’t necessarily practical.

A shot of Espresso 
One startup in Silicon Valley claims to have found a solution to this conundrum. Database Trends and Applications noted Espresso Logic created a new program that combines big data and SQL information into one cohesive REST API for building the backbone of Web and mobile software.

The REST API provides professionals with real-time access to MongoDB and SQL databases and eliminates the need to replicate data across sources. Security and business logic, two essentials in today’s IT economy, are included as a part of the package.

“This is our attempt to allow companies to join this data together in a common API. People can do that manually but it takes a very long time. This is basically point and click in most cases, and in some cases writing just a few lines of JavaScript code is all it takes,” said Espresso Logic CEO R. Paul Singh, as quoted by the source.

For analysis’s sake, developers could use Espresso Logic’s solution to aggregate and process information much faster than they could before.


Throw Away Out-of-the-Box Monitoring Solutions, Additional Services Series Pt. 7 [VIDEO]

Fri, 2014-09-05 08:24

Transcript

Welcome to RDX. Today, we're going to talk about how we deploy our non-database server monitoring program. What's a non-DB server? The machines that run your enterprise applications, handle network connections and perform other critical functions.

When customizing server monitoring solutions, we follow a five-step process:

One: Understand each customer’s unique monitoring needs
Two: Identify critical monitoring parameters
Three: Develop thresholds, goals and objectives
Four: Develop and document monitoring, remediation and escalation procedures
Five: Monitor and adjust the strategy as necessary

RDX dedicates an entire team responsible for creating, implementing and enhancing a strategic blueprint for the proactive monitoring and troubleshooting methods required to prevent availability and performance problems before they occur.

Companies employing RDX to support their environments do not have to spend the additional money required to create a proactive monitoring environment.

Thanks for watching!
 


Access Your Data Sources by Opening the GoldenGate: Additional Services Series Pt. 6 [VIDEO]

Wed, 2014-09-03 12:01

Transcript

In order to analyze your data in real-time, you need an architecture that operates at the same pace.

Hi and welcome to RDX! If you're using multiple database systems to store your information, you know how much of a pain it is to combine all that data.

For this reason, RDX offers expertise and support for GoldenGate. GoldenGate provides data capture, replication, transformation and delivery between heterogeneous sources.

This means information stored in PostgreSQL, IBM DB2, and several other systems can be seamlessly aggregated in a single instance. Capturing and delivering data has never been easier!

Thanks for watching! Be sure to join us next time.


Remote Support for Windows/UNIX/LINUX: Additional Services Series Pt. 5 [VIDEO]

Fri, 2014-08-29 06:17

Transcript

When outsourcing your operating system support, you want to know that you have expert professionals with knowledge of all your platforms handling your data. At RDX, that’s something you don’t have to worry about.

Welcome back to our Additional Services series!

Whether you use Windows, UNIX or LINUX systems, we remotely support anything and everything an admin does onsite. Our Windows OS tech support includes hardware selection, monitoring and tuning, among many other services. We assume total ownership of everything: your server’s security, performance, availability and improvement, and we understand the mutually dependent OS/DB relationship that affects all these things. The same goes for UNIX and LINUX.

Financially, you pay a single bill for both database and OS support services, and you only pay for the services you need, when you need them.

For more details on our extensive operating system support services, follow the link below. We’ll see you next time!


PostgreSQL vs. MySQL: Part Two

Fri, 2014-08-29 01:34

Part One outlined the histories and basic foundations of PostgreSQL and MySQL, respectively.

In Part Two, we'll focus on the benefits of using both of these structures and how remote DBA professionals use them to perform mission-critical functions for enterprises.

What is a relational database management system?
Before going into further detail on PostgreSQL and MySQL, it's important to define what an RDBMS is, as both of these systems subscribe to this model. According to DigitalOcean, an RDBMS stores information by identifying related pieces of data to form comprehensive sets, or schemas. The resulting tables are easily queried by data analysts, applications and other entities because their columns define attributes and each row holds a record.
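A minimal illustration of that model is below; the schema is hypothetical.

-- Columns define the attributes; each row holds one record.
CREATE TABLE customers (
    customer_id INT PRIMARY KEY,
    name        VARCHAR(100) NOT NULL,
    email       VARCHAR(255)
);

INSERT INTO customers (customer_id, name, email)
VALUES (1, 'Ada Lovelace', 'ada@example.com');

SELECT name, email FROM customers WHERE customer_id = 1;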

MySQL: Support, advantages and drawbacks

As Carla Schroder of OpenLogic noted, MySQL is a solid choice for IT professionals working with Web architectures. It's capable of organizing unstructured information, such as the kind of data found on Twitter, Facebook and Wikipedia (all of which are powered by MySQL). DigitalOcean asserted the platform possesses sound security functions for data access, and common tasks are easy to perform.

As for the disadvantages, DigitalOcean acknowledged MySQL handles read-heavy tasks really well but falls somewhat short when it comes to concurrent read-write workloads. In addition, the platform lacks a full-text search component.

PostgreSQL: Support, advantages and drawbacks
DigitalOcean maintained PostgreSQL can handle a large variety of responsibilities quite efficiently due to its high programmability and ACID compliance. Users can implement custom procedures, a few of which can be developed to simplify intricate, common database administration operations. Because it is object-relational, it can support nesting and other powerful features. Complex, customized tasks can be easily implemented and deployed.
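As a small, hypothetical example of such a custom procedure, here is a function written in PostgreSQL's PL/pgSQL; the table it queries is illustrative only.

-- Wrap a common administrative check in a reusable function.
CREATE FUNCTION active_customer_count() RETURNS bigint AS $$
BEGIN
    RETURN (SELECT count(*) FROM customers WHERE active);
END;
$$ LANGUAGE plpgsql;

SELECT active_customer_count();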

What are its shortcomings? For one thing, it's more difficult for people to find hosting services for PostgreSQL, and for simple, read-heavy operations the platform can be "overkill," as DigitalOcean described it.


Oracle EBS Techno Functional Support: Additional Services Series Pt. 4 [VIDEO]

Thu, 2014-08-28 13:37

Transcript

Welcome back to our Additional Services series. Today we’re highlighting our Oracle EBS Techno Functional Support, a service we offer to help customers make sure their Oracle applications are running properly.

At RDX we offer full Oracle EBS support from a team of experts, ensuring your mission-critical environments are available 24×7. Our team helps you customize your applications to meet business needs, and even provides advice about the best features to use so you can take advantage of advanced functionality. When problems do occur, RDX assigns experts to work Severity 1 issues around the clock.

Our dedicated EBS experts have cross-functional experience and adhere to industry best practices. We’ll also assign project managers to ensure we are on time and on budget with projects.

For more information on the full breadth of our Oracle EBS techno functional support, follow the link below! We’ll see you next time.


PostgreSQL vs. MySQL: Part One

Thu, 2014-08-28 11:50

PostgreSQL and MySQL are recognized as the world's two most popular open source database architectures, but there are some key differences between them.

Database administration professionals often favor both environments for their raw, customizable formats. For those who are unfamiliar with the term, open source means the code used to create these architectures is freely available to the public, allowing IT experts of every ilk to reconstruct the program to fit specific needs.

A quick history: PostgreSQL

Carla Schroder, a contributor to OpenLogic, acknowledged PostgreSQL as the older solution, having been developed at the University of California, Berkeley in 1985. Thousands of enthusiasts from around the world have participated in the development and support of this architecture. DigitalOcean labeled the solution an object-relational database management system capable of handling mission-critical applications and high-frequency transactions. Here are some other notable traits:

  • Fully compliant with atomicity, consistency, isolation and durability (ACID)
  • Uses Kerberos and OpenSSL for robust protection features
  • Point-in-time recovery enables users to implement warm standby servers for quick failover

A quick history: MySQL
As for MySQL, Schroder noted this particular system is about nine years younger than its predecessor, having been created by MySQL AB in 1994. It provides a solid foundation for Web developers, as it's part of the software bundle comprised of Linux, Apache HTTP Server, MySQL and PHP (the LAMP stack). MySQL was originally designed to be a reliable Web server backend because it used an expedited indexed sequential access method (ISAM). Over the years, experts have revised MySQL to support a variety of other storage engines, such as the MEMORY architecture that provides temporary tables.
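For illustration, selecting that engine is a one-line choice at table creation; the schema here is hypothetical.

-- A RAM-resident table using MySQL's MEMORY storage engine.
CREATE TABLE session_cache (
    session_id CHAR(32) PRIMARY KEY,
    user_id    INT
) ENGINE = MEMORY;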

Although open source, MySQL isn't community-governed, and some editions (all of which are now owned and distributed by Oracle) cost a small amount of capital.

Part Two will dig deeper into these two architectures, describing use cases, their respective capabilities and more.


Taking a proactive approach to database security

Mon, 2014-08-25 01:09

The three V's of big data – volume, variety and velocity – are causing vexing issues for database security teams. 

Administrators are responsible for carrying out a number of essential tasks, such as:

  • Improving the performance of Hadoop, MySQL, NoSQL and other complex architectures
  • Allowing data analysts to access these environments to scrutinize information 
  • Assigning access permission and privileges 
  • Monitoring interactions and instances in real time 

The era of big data requires professionals to carry out these responsibilities proactively, without introducing risk. Conducting surveillance is arguably the most essential task DBAs need to perform.

Overwhelmed
ESecurity Planet contributor Jeff Goldman noted many security breaches are caused by a lack of available personnel. Employing tools that actively seek inconsistencies and infiltration attempts is a strong approach, but if a company doesn't have the professionals on deck to assess such issues, information could be lost or stolen. 

Goldman noted Target's experts received numerous warnings before the headline-making breach occurred, but they didn't have the resources or time required to mitigate any problems. The amount of data they were required to manage hindered them from acknowledging a critical issue. 

Relieving stress 

It's this kind of situation that prompts businesses to outsource to companies that provide database active monitoring. As opposed to leaving a team of two dozen administrators to struggle with information management, enterprises can benefit from the resources of an entire organization. 

Remote database support groups dedicate not only software, but entire teams of professionals that proactively search for problems residing in complex architectures. Such experts have the resources to take the following measures:

  • Program and develop customized surveillance tools designed to cater to specific customer needs
  • Consult a comprehensive library of complex problems and solutions that can be applied to resolve them
  • Implement surveillance programs that abide by standards defined by HIPAA, the Sarbanes-Oxley Act and Payment Card Industry regulations
  • Network internally to resolve intricate issues as a team

Collecting external data
The Guardian referenced a study by PricewaterhouseCoopers, which surveyed 10,000 employees to figure out how businesses were collecting social data. The research discovered younger people were more than happy to share such information, meaning enterprises are going to be aggregating more unstructured information, such as:

  • Comments on Facebook
  • Photo shares on Instagram
  • Videos through YouTube

Enterprises will likely want to process the intelligence in real time, putting pressure on in-house teams to secure communications and effectively organize data in environments such as Hadoop and NoSQL. 

If businesses want to benefit from scrutinizing social information, they need the attention outsourced DBAs can provide. For more information on database active monitoring, read this white paper. 


SQL Server techniques that mitigate data loss

Fri, 2014-08-22 10:58

Natural disasters, cybercriminals and costly internal mistakes can cause companies to lose critical information. If the appropriate business continuity strategies aren't employed, organizations could lose customers or face government penalties. 

These concerns have motivated database administrators to use programs with strong recovery tools, such as replication, mirroring and failover clustering.

SQL Server Pro contributor and DH2i Chief Technology Officer Thanh Ngo outlined a number of SQL Server functions that improve disaster recovery and ensure applications are highly available. Some of the most useful features he named are detailed below.

AlwaysOn Availability Groups

This particular feature allows a set of user databases to fail over as a cohesive unit. A primary replica of the group is made available for read-write tasks; from there, one to eight secondary replicas are created, which can serve read-only traffic if configured to do so. 
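As a hedged sketch of what this looks like in T-SQL (SQL Server 2012 and later), the statement below creates a two-replica group; the server and database names are hypothetical, and endpoints, full backups and the HADR feature must already be configured on each replica.

-- Create an Availability Group with one readable secondary.
CREATE AVAILABILITY GROUP SalesAG
FOR DATABASE SalesDB
REPLICA ON
    N'PRIMARY01' WITH (
        ENDPOINT_URL = N'TCP://primary01.corp.local:5022',
        AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
        FAILOVER_MODE = AUTOMATIC),
    N'SECONDARY01' WITH (
        ENDPOINT_URL = N'TCP://secondary01.corp.local:5022',
        AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
        FAILOVER_MODE = AUTOMATIC,
        SECONDARY_ROLE (ALLOW_CONNECTIONS = READ_ONLY));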

Availability Groups build on the concept of database mirroring, which enhances a database's safety and accessibility by copying transaction activity from a principal server to its mirrored counterpart. Ngo outlined how mirroring functions in one of the following modes, each of which maps onto the partner settings sketched after this list:

  • High safety mode with automatic failover: Transactions carried out by two or more partners are synchronized while a "witness partner" orchestrates automated failover. 
  • High safety mode without automatic failover: The same operation detailed above is executed, but without the presence of a witness partner.
  • High performance mode: A primary database commits a transaction without waiting for the mirrored counterpart to write the log to disk.
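Here is a hedged T-SQL sketch of those settings, run on the principal; the names are hypothetical, and the mirror database must first be restored WITH NORECOVERY.

-- Point the principal at its mirror.
ALTER DATABASE SalesDB
    SET PARTNER = 'TCP://mirror01.corp.local:5022';

-- SAFETY FULL = high safety (synchronous); SAFETY OFF = high performance.
ALTER DATABASE SalesDB SET PARTNER SAFETY FULL;

-- Adding a witness enables automatic failover in high safety mode.
ALTER DATABASE SalesDB
    SET PARTNER WITNESS = 'TCP://witness01.corp.local:5022';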

Replication 
Ngo also acknowledged replication, which consists of a primary server (also known as the publisher) allocating data to one or more secondary databases (known as subscribers). Replication can be executed in one of three ways, as sketched after this list: 

  • Transactional allows for real-time data availability because it enables the publisher to distribute information to subscribers immediately, or at regular intervals. 
  • Snapshot copies the publisher's data at a point in time and sends the cloned data to the secondary database once it's created.
  • Merge enables bi-directional replication, meaning all changes made in both the subscriber and publisher databases are synchronized automatically. 
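A hedged illustration of the transactional flavor, using SQL Server's replication procedures, follows; the database, publication and server names are hypothetical, and a Distributor must already be configured and the database enabled for publication.

-- Publish a table transactionally, then subscribe a second server.
USE SalesDB;
EXEC sp_addpublication
     @publication = N'SalesPub',
     @repl_freq   = N'continuous';  -- transactional rather than snapshot

EXEC sp_addarticle
     @publication   = N'SalesPub',
     @article       = N'orders',
     @source_object = N'orders';

EXEC sp_addsubscription
     @publication    = N'SalesPub',
     @subscriber     = N'REPORTSRV01',
     @destination_db = N'SalesDB_Replica';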

A contemporary need 
With these techniques in mind, if a company chooses to outsource its database administration needs, it should detail which government standards it needs to abide by. From there, DBAs can carry out a thorough risk assessment of how much customer data is vulnerable – a task MSPmentor contributor CJ Arlotta asserted is critical. 

Employing SQL Server's replication and Availability Groups features helps ensure data is retained even if a database failure occurs. 


Identifying Deadlocks Using the SQL Server Error Log

Thu, 2014-08-21 09:30

Deadlocking in SQL Server can be one of the more time-consuming issues to resolve. The script below can reduce the time it takes to gather necessary information and troubleshoot the cause of the deadlocks. Using this script requires your SQL Server version to be 2005 or newer and for Trace Flag 1222 to be enabled to capture the deadlocking information in the error log.
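If the trace flag isn't already active, it can be enabled globally like this (or with the -T1222 startup parameter):

-- Write deadlock details to the error log for all sessions.
DBCC TRACEON (1222, -1);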

The first portion of the script collects the data written to the error log and parses it for the information needed. With this data, the script can return many different data points for identifying the root cause of your deadlocks.
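That collection step isn't reproduced in this post, so here is a minimal sketch of how a @results table like the one referenced below might be populated; it assumes the undocumented xp_readerrorlog procedure and is not the downloadable script itself.

-- Read the current error log and number each deadlock section.
declare @errorlog table (
    id int identity(1,1),
    logdate datetime,
    processinfo varchar(50),
    logtext varchar(max));

insert into @errorlog (logdate, processinfo, logtext)
exec xp_readerrorlog 0, 1;  -- 0 = current log, 1 = SQL Server error log

declare @results table (
    id int,
    deadlockcount int,
    logdate datetime,
    processinfo varchar(50),
    logtext varchar(max));

-- deadlockcount tags each row with the number of 'deadlock-list'
-- markers seen so far, grouping rows by deadlock occurrence.
insert into @results (id, deadlockcount, logdate, processinfo, logtext)
select e.id,
       (select count(*) from @errorlog x
        where x.id <= e.id and x.logtext = 'deadlock-list'),
       e.logdate, e.processinfo, e.logtext
from @errorlog e;

With @results populated, the script begins with a query to return the number of deadlocks in the current error log.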

select
distinct top 1 deadlockcount
from @results
order by deadlockcount desc

The next script will allow you to review all of the deadlock information in the current error log. It will output the raw InputBuffer details, but if the queries running in your environment have extraneous tabs or spaces, you can modify the commented portion to remove them.

select 
deadlockcount, logdate, processinfo, 
logtext
--,rtrim(ltrim(replace(replace(replace(replace(replace(replace(replace(replace(logtext,'               ',' '),'       ',' '),'     ',' '),'   	',' '),'    ',' '),'  ',' '),'  ',' '),'	',' '))) as logtext_cleaned
from @results
order by id

An important piece of information when identifying and resolving deadlocks is the resource locks. This next query returns all of the error log records containing details for the locks associated with deadlocks. In some situations, the object and/or index name may not be included in this output.

select distinct
logtext
from @results 
where logtext like '%associatedobjectid%'

In order to find the objects involved with the deadlock occurrences, run the next query’s results to text. Then, copy the output into a new query window and remove the ‘union’ from the end. When run, it will return the object and index names.

select distinct
'SELECT OBJECT_NAME(i.object_id) as objectname, i.name as indexname
      FROM sys.partitions AS p
      INNER JOIN sys.indexes AS i ON i.object_id = p.object_id AND i.index_id = p.index_id
      WHERE p.partition_id = '+convert(varchar(250),REVERSE(SUBSTRING(REVERSE(logtext),0,CHARINDEX('=', REVERSE(logtext)))))+'
	  union
	  '
from @results 
where logtext like '   keylock hobtid=%'
union
select distinct
'SELECT OBJECT_NAME(i.object_id) as objectname, i.name as indexname
      FROM sys.partitions AS p
      INNER JOIN sys.indexes AS i ON i.object_id = p.object_id AND i.index_id = p.index_id
      WHERE p.partition_id = '+convert(varchar(250),REVERSE(SUBSTRING(REVERSE(logtext),0,CHARINDEX('=', REVERSE(logtext)))))+'
	  union
	  '
from @results
where logtext like '   pagelock fileid=%'

In my experience, situations can arise where there are a large number of deadlocks but only a few queries involved. This portion of the script will return the distinct queries participating in the deadlocks. The commented lines can be modified to remove extra tabs and spaces. To avoid issues caused by the InputBuffer data being on multiple lines, you should cross-reference these results with the results of the next query.

select
max(deadlockcount) as deadlockcount, max(id) as id, 
logtext
--rtrim(ltrim(replace(replace(replace(replace(replace(replace(replace(replace(logtext,'               ',' '),'       ',' '),'     ',' '),'   	',' '),'    ',' '),'  ',' '),'  ',' '),'	',' '))) as logtext_cleaned
from @results
where logtext not in (
'deadlock-list',
'  process-list',
'    inputbuf',
'    executionStack',
'  resource-list',
'    owner-list',
'    waiter-list'
)
and logtext not like '     owner id=%'
and logtext not like '     waiter id=%'
and logtext not like '   keylock hobtid=%'
and logtext not like '   pagelock fileid%'
and logtext not like ' deadlock victim=%'
and logtext not like '   process id=%'
and logtext not like '     frame procname%'
group by 
logtext
--rtrim(ltrim(replace(replace(replace(replace(replace(replace(replace(replace(logtext,'               ',' '),'       ',' '),'     ',' '),'   	',' '),'    ',' '),'  ',' '),'  ',' '),'	',' ')))
order by id asc, deadlockcount asc

This query will return the execution stack and InputBuffer details for each deadlock.

select 
deadlockcount, logdate, processinfo, logtext
--rtrim(ltrim(replace(replace(replace(replace(replace(replace(replace(replace(logtext,'               ',' '),'       ',' '),'     ',' '),'   	',' '),'    ',' '),'  ',' '),'  ',' '),'	',' '))) as logtext_cleaned
from @executionstack 
WHERE logtext not like '%process id=%'
and logtext not like '%executionstack%'
order by id asc

For documentation purposes, this query will return the distinct InputBuffer output for the deadlock victims. If the InputBuffer data is on multiple lines, you should cross-reference these results with the results of the next query.

select max(d.deadlockcount) as deadlockcount, max(d.executioncount) executioncount, max(d.id) as id, logtext
--rtrim(ltrim(replace(replace(replace(replace(replace(replace(replace(replace(d.logtext,'               ',' '),'       ',' '),'     ',' '),'   	',' '),'    ',' '),'  ',' '),'  ',' '),'	',' '))) as logtext_cleaned
from @executionstack d
right join (
	select e.executioncount
	from @results r
	join (
		select deadlockcount, logtext, convert(varchar(250),REVERSE(SUBSTRING(REVERSE(logtext),0,CHARINDEX('=', REVERSE(logtext))))) victim
		from @results
		where logtext like ' deadlock victim=%'
	) v on r.deadlockcount=v.deadlockcount
	left join (
		select id, logtext, substring(logtext, charindex('=', logtext)+1,50) processidstart,
		substring(substring(logtext, charindex('=', logtext)+1,50),0, charindex(' ', substring(logtext, charindex('=', logtext)+1,50))) processid
		from @results
		where logtext like '   process id=%'
	) p on r.id=p.id
	join @executionstack e on r.id=e.id
	where v.victim=p.processid
) q on d.executioncount=q.executioncount
where d.logtext not like '   process id=%'
and d.logtext <> '    executionStack'
and d.logtext not like '     frame%'
group by logtext
--rtrim(ltrim(replace(replace(replace(replace(replace(replace(replace(replace(logtext,'               ',' '),'       ',' '),'     ',' '),'   	',' '),'    ',' '),'  ',' '),'  ',' '),'	',' ')))
order by id asc, deadlockcount asc, executioncount asc

This query will return the execution stack and InputBuffer details for each victim.

select d.deadlockcount, d.logdate, d.processinfo, logtext
--rtrim(ltrim(replace(replace(replace(replace(replace(replace(replace(replace(d.logtext,'               ',' '),'       ',' '),'     ',' '),'   	',' '),'    ',' '),'  ',' '),'  ',' '),'	',' '))) as logtext_cleaned
from @executionstack d
right join (
	select e.executioncount
	from @results r
	join (
		select deadlockcount, logtext, convert(varchar(250),REVERSE(SUBSTRING(REVERSE(logtext),0,CHARINDEX('=', REVERSE(logtext))))) victim
		from @results
		where logtext like ' deadlock victim=%'
	) v on r.deadlockcount=v.deadlockcount
	left join (
		select id, logtext, substring(logtext, charindex('=', logtext)+1,50) processidstart,
		substring(substring(logtext, charindex('=', logtext)+1,50),0, charindex(' ', substring(logtext, charindex('=', logtext)+1,50))) processid
		from @results
		where logtext like '   process id=%'
	) p on r.id=p.id
	join @executionstack e on r.id=e.id
	where v.victim=p.processid
	--order by r.id
) q on d.executioncount=q.executioncount
where d.logtext not like '   process id=%'
and d.logtext <> '    executionStack'
order by d.id asc

The script, which can be downloaded here, includes all of these queries for you to use. Each one is independent, so if you are only interested in the results for a single query, the other sections can be commented out.

Any feedback you have is always appreciated. In my opinion, that is one of the best parts about writing T-SQL! Don’t forget to check back for my next post in which I will be using the AdventureWorks2008R2 database to provide an in-depth deadlock analysis.


Data growth inciting need for cloud databases

Mon, 2014-08-18 01:23

To further reduce storage costs, organizations are storing their information in public cloud databases. 

Consistent development in cloud technology has made accessing data across a network easier than computer scientists of 20 years ago could have ever predicted. Due to this popularity, database administration services have trained themselves to run SQL Server queries across Microsoft Azure and other cloud environments. 

Big data, services models evolving
TechTarget contributor John Moore noted that Database-as-a-Service (DBaaS) is becoming less about just providing storage and more about managing, optimizing and conducting performance diagnostics. Simply funneling data into a remote platform often causes disorganization – making it more difficult to find pertinent information and analyze it. 

Moore referenced a statistic produced by MarketsandMarkets, which predicts the cloud database and DBaaS market will grow at a compound annual growth rate of 67.3 percent over the next five years, reaching $14.05 billion by 2019. Outsourcing maintenance and support for cloud data stores reduces overhead and ensures database security remains intact. 

What knowledge is needed? 
In regard to hiring a separate company to manage cloud servers, it's important to acknowledge the types of information organizations are aiming to learn from. Most of the data is unstructured, which is typically accessed through Hadoop storage and NoSQL databases. 

Therefore, remote DBAs who are knowledgeable about both these technologies and about conducting administration via the cloud are essential. That being said, enterprises shouldn't ignore those with extensive knowledge of traditional programs such as SQL Server. 

The advantages of Azure and SQL Server
Because these two programs are both produced by Microsoft, natural compatibility between them is expected. Network World noted that putting SQL data in Azure can save enterprises anywhere from $20,000 to $50,000 in procuring physical data center equipment (servers, bandwidth, storage, etc.).

In order to ensure security, administrators simply need to configure SQL properly. The source acknowledged the following protective functions can be applied to cloud-hosted SQL databases:

  • Azure Security provides users with a "Trust" guide, in which Microsoft details how Azure complies with HIPAA, ISAE and several other data security standards.
  • Transparent Data Encryption enables DBAs to encrypt the contents of an entire database while providing a key that only those who initiated the encryption task can use (a sketch follows this list). 
  • Automatic protection involves Azure keeping databases private by default, meaning users actually have to configure the environment before the public or unauthorized patrons can view the information. 
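For a sense of what the encryption step involves, here is a hedged T-SQL sketch of enabling Transparent Data Encryption; the names are hypothetical, and this is the on-premises SQL Server syntax, while Azure SQL Database manages much of this at the service level.

-- Create the key material, then switch encryption on.
USE master;
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password here>';
CREATE CERTIFICATE TDECert WITH SUBJECT = 'TDE certificate';

USE SalesDB;
CREATE DATABASE ENCRYPTION KEY
    WITH ALGORITHM = AES_256
    ENCRYPTION BY SERVER CERTIFICATE TDECert;

ALTER DATABASE SalesDB SET ENCRYPTION ON;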

Aside from these amenities, employing active database monitoring is the best way for organizations to keep cloud databases protected from malicious figures. 


How RDX’s BI services make a difference: Additional Services Series pt. 3 [VIDEO]

Thu, 2014-08-14 12:59

Transcript

At RDX, we provide a full suite of BI services that includes data integration (SSIS), analysis and mining of data (SSAS), and scheduled and manual reporting of data in a variety of formats for visual representation (SSRS).

Our SSIS services include extracting, transforming and loading data from any source into a common format that you can easily understand and use to make better business decisions.

We support high volumes of data and automated workflows, provide auto-transformations of many kinds, and offer custom coding in C# and VB.net.

Our SSAS services allow you to choose between a multi-dimensional (cube) or tabular OLAP – online analytical processing – model to break down the data we've gathered and transition it into your browser of choice for easy, actionable reporting. Our SSRS services come in an array of drill-down and drill-through graphs, charts and diagrams, so you can make the most of your data, including accessing previously stored reports.

For more details, download our BI whitepaper. We'll see you next time!

 


Can you handle big data? Oracle may have an answer

Wed, 2014-08-13 01:33

Now more than ever, database administration services are providing their clients with the expertise and software required to support big data endeavors. 

They haven't necessarily had much of a choice. Businesses need environments such as Hadoop to store the large amount of unstructured data they strive to collect and analyze to achieve insights regarding customer sentiment, procurement efficiencies and a wealth of other factors. 

Oracle's assistance 
According to PCWorld, Oracle recently released a software tool capable of querying Hadoop and NoSQL (Not Only SQL) environments. The solution is an add-on for the company's Big Data Appliance, a data center rack comprised of its Sun x86 servers programmed to run Cloudera's Hadoop distribution.

In order for businesses to benefit from the simplicity of Big Data SQL, the source noted they must have an Oracle 12c database installed on the company's Exadata database machine. This allows Exadata and the x86 Big Data Appliance configuration to share an interconnect for data exchange. 
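Conceptually, Big Data SQL exposes Hadoop data to ordinary Oracle queries through external tables. The hedged sketch below shows the general shape of this; the schema and access parameters are hypothetical examples of the ORACLE_HIVE access driver, not configuration from the article.

-- Define an external table over a Hive table, then join it
-- with relational data in a single statement.
CREATE TABLE twitter_feed (
    tweet_id  NUMBER,
    user_name VARCHAR2(128),
    body      VARCHAR2(4000)
)
ORGANIZATION EXTERNAL (
    TYPE ORACLE_HIVE
    DEFAULT DIRECTORY default_dir
    ACCESS PARAMETERS (com.oracle.bigdata.tablename = social.twitter_feed)
);

SELECT c.customer_id, t.body
FROM   customers c
JOIN   twitter_feed t ON t.user_name = c.twitter_handle;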

Assessing a "wider problem"

Oracle Vice President of Product Development Neil Mendelson asserted the solution wasn't created for the purpose of replacing existing SQL languages such as Hive and Impala. Instead, Mendelson maintained that Big Data SQL enables remote DBA experts to query a variety of information stocks while moving a minimal amount of data. 

This means organizations don't have to spend the time or network resources required to move large troves of data from one environment to another, because Smart Scan technology is applied to conduct filtering on a local level.

InformationWeek contributor Doug Henschen described Smart Scan as a function that combs through data on the storage tier and identifies what information is applicable to the submitted query. Oracle Product Manager Dan McClary outlined an example of how it could be used:

  • A data scientist wants to compare and contrast Twitter data in Hadoop with customer payment information in Oracle Database
  • Smart Scan filters out tweets that don't have translatable comments and eliminates posts without latitude and longitude data
  • Oracle Database then receives one percent of the total Twitter information in Hadoop
  • A visualization tool identifies location-based profitability based on customer sentiment

Reducing risk 
In addition, Oracle allows DBA services to leverage authorizations and protocols to ensure security is maintained when Hadoop or NoSQL is accessed. For instance, when a professional is assigned the role of "analyst" he or she has permission to query the big data architectures, while those who lack permission cannot. 


Websites: What to look for in a database security contract

Mon, 2014-08-11 10:28

When shopping for a world-class database administration service, paying attention to what specialists can offer in the way of protection is incredibly important. 

For websites storing thousands or even millions of customer logins, constantly monitoring server activity is essential. A recent data breach showed just how vulnerable e-commerce companies, Software-as-a-Service providers and a plethora of other online organizations are. 

A staggering number 
A Russian criminal organization known as "CyberVor" recently collected 1.2 billion unique user name and password sequences and 500 million email addresses from websites executing lackluster protection techniques, Infosecurity Magazine reported.

Andrey Dulkin, senior director of cyber innovation at CyberArk, noted the attack was orchestrated by a botnet – a collection of machines working to achieve the same end goal. CyberVor carefully employed multiple infiltration techniques simultaneously in order to harvest login data. 

Where do DBAs come into play? 
Database active monitoring is essential to protect the information websites hold for their subscribers and patrons. Employing anti-malware is one thing, but being able to perceive actions occurring in real-time is the only way organizations can hope to deter infiltration attempts at their onset. 

Although TechTarget was referring to disaster recovery, the same principles of surveillance apply to protecting databases. When website owners look at the service-level agreement, the database support company should provide the following accommodations:

  • Real-time reporting of all server entries, detailing which users entered an environment, how they're interacting with it and what programs they're using to navigate it. 
  • Frequent testing that searches for firewall vulnerabilities, unauthorized programs, suspicious SQL commands, etc. 
  • On-call administrators capable of assessing any questions or concerns a website may have.

Applying basics, then language 
Although advanced analytics and tracking cookies can be applied to actively search for and eliminate viruses – like how white blood cells attack pathogens – neglecting to cover standard security practices obviously isn't optimal. 

South Florida Business Journal acknowledged one of the techniques CyberVor used was a vulnerability IT professionals have been cognizant of for the past decade – SQL injection. This particular tactic likely involved one of the criminals ordering a SQL database to unveil all of its usernames and passwords. 
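To make the mechanism concrete, here is a hedged T-SQL illustration of the flaw and its standard fix; the table and input are hypothetical.

-- Vulnerable: user input concatenated into the statement.
-- Input such as  ' OR 1=1 --  returns every row in the table.
DECLARE @username nvarchar(128) = N''' OR 1=1 --';
DECLARE @unsafe nvarchar(max) =
    N'SELECT username, password_hash FROM dbo.users '
  + N'WHERE username = ''' + @username + N'''';
EXEC (@unsafe);

-- Safer: pass the value as a typed parameter with sp_executesql,
-- so it can never be interpreted as SQL text.
EXEC sp_executesql
    N'SELECT username, password_hash FROM dbo.users WHERE username = @u',
    N'@u nvarchar(128)',
    @u = @username;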

SQL Server, Microsoft's signature database solution, is quite popular among many websites, so those using this program need to contract DBA organizations with extensive knowledge of the language and best practices. 

Finally, remote DBA services must be capable of encrypting login information, as well as the data passwords are protecting. This provides an extra layer of protection in case a cybercriminal manages to unmask a username-password combination. 


Advantages & Benefits of BI in your business: Additional Services Series pt. 2 [VIDEO]

Tue, 2014-08-05 05:13

Transcript

To help our customers make the best possible business decisions, we offer a complete set of Business Intelligence support services for Microsoft’s Integration Services (SSIS), Analysis Services (SSAS), and Reporting Services (SSRS) product sets.

Why Business Intelligence, you might ask? It helps you quickly identify business trends, empower your staff to use relevant information, gain insight into customer behavior for upsell opportunities – and more. Plus, the low-cost entry point of SSIS, SSAS and SSRS requires no third-party software and comes with everything needed to create a robust BI environment.

Microsoft’s BI product stack’s compatibility with other applications combined with having all your support provided by a single vendor helps to simplify your processes and take maximum advantage of your BI applications.

For more details on the importance of BI and how our product set can enhance your business, click on the link below to download our BI whitepaper.

In our next video, we’ll dive deeper into our specific BI products. See you there!
 


Why automation is a key component of database administration

Mon, 2014-08-04 10:54

When considering outsourcing to a remote database support service, look for professionals capable of automating key SQL Server tasks. 

Administrators have a variety of tasks they need to perform on a daily basis. In a world rife with IT security woes, it's imperative for DBAs to be able to dedicate enough time and resources to deterring sophisticated cyberattacks. Ordering rudimentary management jobs to occur without manual attention can help them succeed in this endeavor. 

Using SQL Server Agent 
According to MSDN, thorough knowledge of SQL Server Agent enables professionals to automatically execute jobs on a predefined schedule, respond to particular occurrences and execute backups whenever they're needed (a job-creation sketch follows the list below). The source outlined a number of additional benefits DBAs often capitalize on:

  • Multiple tasks can be executed at the same time
  • Jobs can be ordered to initiate when a central processing unit isn't conducting any operations
  • SQL Server performance conditions can be monitored and responded to automatically 
  • The actions of authorized operators, as well as intruders, can be monitored 
  • Logs of all security-related activities performed by administrators can be created
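Here is a hedged sketch of such an automated job, a nightly backup, using the msdb procedures SQL Server Agent exposes; the job, schedule, database and path names are hypothetical.

-- Create a job, give it a T-SQL step, schedule it daily at 2 a.m.,
-- and register it with the local server.
USE msdb;
EXEC sp_add_job @job_name = N'Nightly backup';

EXEC sp_add_jobstep
     @job_name  = N'Nightly backup',
     @step_name = N'Back up SalesDB',
     @subsystem = N'TSQL',
     @command   = N'BACKUP DATABASE SalesDB TO DISK = N''D:\backup\SalesDB.bak''';

EXEC sp_add_schedule
     @schedule_name     = N'Daily 2am',
     @freq_type         = 4,      -- daily
     @freq_interval     = 1,
     @active_start_time = 020000; -- HHMMSS

EXEC sp_attach_schedule
     @job_name      = N'Nightly backup',
     @schedule_name = N'Daily 2am';

EXEC sp_add_jobserver @job_name = N'Nightly backup';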

For more complex jobs, managers need to be able to receive notifications pertaining to mission-critical instances. For example, whenever hardware fails (due to a storm, or some other disruption), organizations need to be able to offload databases to other equipment or deploy backup strategies. For this reason, manually initiating alerts isn't conducive to business continuity. 

What can they do? 
Remote DBA experts recognize the benefits of automation, but where does the magic come from? Database Journal outlined several notifications these professionals can utilize through SQL Server Agent. For instance, a job's "lifecycle" can be reported as a success, failure or completion. 

While receiving alerts for all three outcomes is possible, it's not necessarily recommended. Alerting only on failure is often more constructive, because DBAs' inboxes won't get cluttered with a barrage of emails.

Yet, there are some jobs administrators may want to follow from execution to completion. In order to figure out which jobs are set to send notifications, DBAs can review the notify_level_email value in the sysjobs table in the msdb database.

If the value is zero, personnel know that no email alerts have been set up for a specific job. If a 1, 2 or 3 is displayed, an email is sent on success, on failure or on completion, respectively (separate notify_level_page and notify_level_netsend columns control pager and NET SEND alerts). 
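As a quick example of that check:

-- List each Agent job's e-mail notification level from msdb
-- (0 = never, 1 = on success, 2 = on failure, 3 = on completion).
SELECT name, notify_level_email
FROM   msdb.dbo.sysjobs
ORDER  BY name;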

Essentially, automation enables experts to optimize database active monitoring thoroughness and speed. 
