
Chris Foot

Remote DBA Experts Blog

Sports communities using data analytics

Fri, 2014-03-21 01:00

Professional sports franchises, and even some members of the National Collegiate Athletic Association, are turning to data analytics tools to better predict the outcome of competitions. Database administration services employed by these organizations often monitor applications that develop training regimens and game plans based on algorithmic conclusions. 

Educators taking advantage of the technology 
According to the Massachusetts Institute of Technology, the university's annual Sloan Sports Analytics Conference has positioned the school as a crucial part of the growing sports analytics industry. The two-day event typically attracts a sold-out audience consisting of almost 3,000 attendees, about 10 percent of them owners, players and representatives from some of the most reputable professional teams in the United States and Europe. 

Daryl Morey, a 2000 MIT graduate and managing director of basketball operations for the Houston Rockets, stated that many sports organizations in the U.S. have hired database support services to manage a working infrastructure for their analytics applications. He told the source that teams are using the tools to develop in-game strategies as well as business plans. 

The article noted baseball's utilization of advanced metrics such as value over replacement player (VORP), a system created by Keith Woolner that estimates how much an athlete contributes to his or her team in comparison to a near-average stand-in teammate at the same position. 

Taking it to the next level 
Due to the amount of money invested in professional and even a few collegiate sports teams, new developments in analytics tools are sure to provide those organizations with more accurate, plentiful information. Lauren Brousell, a contributor to CIO, noted that remote DBA experts have overseen programs capable of replacing the judgment calls of baseball umpires, supplying fans with customized digital information and delivering real-time surveillance. 

The news source stated that technology vendors are trying to capitalize on the wearable technology market. Fitness trackers are becoming popular among everyday consumers, but sports teams are looking to take the next step. Athletic apparel company Adidas recently created devices players can attach to their jerseys. 

"Data from the device shows the coach who the top performers are and who needs rest,"  wrote. "It also provides real-time stats on each player, such as speed, heart rate and acceleration."

Some tools are being developed to show what will draw fans to sports venues. John Forese, senior vice president and general manager of Live Analytics, stated that knowing specifics like whether a person is interested in an opposing team coming to town can be valuable intelligence for franchises. 

Public sector may benefit from language analytics tools

Fri, 2014-03-21 00:55

Government entities have been expanding their network capabilities, consulting database experts and deploying new technologies in an effort to bring better services to constituents. As the United States consistently engages with foreign nations, agencies frequently encounter language barriers when opening communications with diplomats. Furthermore, immigrants new to communities may not be able to accurately express concerns to state, town or city officials. 

Not as far ahead as one would think 
Although some federal entities such as the U.S. National Security Agency possess incredible surveillance and data analysis technologies, outlying bureaucracies at the state and local levels are lagging behind. Taylor Armerding, a contributor to CIO, noted that the private sector has made considerable use of digital information analytics applications. The source claimed that a general lack of understanding of big data and database administration is apparent among legislators. 

Other factors hindering the public sector from keeping pace with private enterprises are bureaucracy and political pressure from representatives and officials. Armerding noted that many corporations don't typically have to contend with these obstructions because they're not as prevalent in a business environment. In addition, federal and state agencies tend to see big data merely as a way to collect intelligence rather than as a tool for shaping policy. 

Obtaining a higher level of understanding 
Language translation isn't new to computing. However, database support services have yet to encounter a program that can accurately decipher the emotional context of communications, whether written or spoken. As the NSA and U.S. Central Intelligence Agency continuously attain intelligence from foreign and domestic entities, they typically rely on translators and dialect specialists to provide subtext for conversations. 

Big data analytics tools may provide such an avenue. According to InformationWeek, Geoffrey Pullum, professor of general linguistics at the University of Edinburgh, noted that computers require three distinct features in order for them to properly understand language:

  • Enough syntax to uniquely identify a sentence
  • An adequate amount of semantics to identify its literal meaning
  • Sufficient pragmatics to discern the intent behind the phrase

The source noted that The Outside View, a London-based company, is currently developing a voice analytics application capable of studying the mood and quality of sales calls to determine the probability of a deal being closed. Email is easier for the software to decipher, but phone calls complicate the process. 

Possibly with the assistance of database experts, government agencies could implement a program capable of identifying true emotions. Without that capability, if a person is feigning anger, an application may misidentify what the speaker is actually implying. 

It’s Not a Zebra: Debugging Your Code Efficiently

Thu, 2014-03-20 08:05

If you’re in a pasture and hear hoofbeats…it’s probably not a zebra.

I have heard this saying several times in the past. It basically means to check the most obvious first.

A doctor once told me this when my daughter was having stomach issues. After weeks of every kind of test for scary medical issues without finding anything wrong, we went to a second doctor. He told us “If you’re in a pasture and hear hoof beats…it could possibly be a zebra….but it’s probably just a horse”. In other words, it could be stomach cancer, could be ulcers, etc. Our first doctor was checking for all the “zebras” before eliminating the possibility of the more common “horse”. It turns out my teenage daughter was just suffering from some stressful issues at school and all was well once this was resolved.

As a DBA, I am often presented with extremely complex code to debug, optimize, or rewrite. Occasionally, we are even lucky enough to find an obscure bug in SQL Server that causes strange behavior. I was presented with a scenario recently that had me confused and flustered. I’m actually embarrassed to admit how much time this issue took me to resolve.

A user presented me with a question concerning the usage of variables in a LIKE operation. He was getting very unpredictable results and asked me to review his sample code. All three iterations of his code appeared to be equivalent and should have returned the same results. He was getting very different results with each one.

Sample code:

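The original screenshots are no longer available, so here is a minimal sketch of the kind of code in question, assuming an AdventureWorks-style Production.Product table; aside from the variable declaration, which reappears later in this post, the names and values are reconstructions:

declare @queryvar nvarchar;   -- no length specified (the eventual culprit)
set @queryvar = N'FR';        -- silently truncated to N'F'

-- 1) Variable concatenated directly into the pattern
select COUNT(*) from Production.Product P
where P.name like '%' + @queryvar + '%';

-- 2) Pattern built in a second variable
declare @pattern nvarchar(10) = '%' + @queryvar + '%';
select COUNT(*) from Production.Product P
where P.name like @pattern;

-- 3) Literal pattern
select COUNT(*) from Production.Product P
where P.name like '%FR%';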

I realize that using a preceding % in a LIKE is inefficient, but the usage of the variable in all three samples appeared correct.

The actual results are shown below:

[Screenshot: the three queries returned very different counts.]

Hmmm…

This had me confused and led me down many paths. I searched the web for hotfixes, connect items, issues with the LIKE command in 2008 R2, and anything else I could think of. I couldn’t find instances of anyone else reporting this type of issue. I decided there had to be an issue with the underlying data.

I removed the ‘COUNT(*)’ and replaced it with the P.name column name so I could view the data. Now things were really confusing.

[Screenshot: rows returned by the first two queries.]

The results from the first two queries looked nothing like I expected; many rows had no visible ‘FR’ characters.

The last query, however, did return the expected results:

[Screenshot: rows returned by the third query, all containing ‘FR’.]

I even examined the hex and varbinary values of the underlying data to look for strange hidden characters:

[Screenshots: hex and varbinary representations of the underlying data.]

At this point, I had to take a break. After a little caffeine in my system and a few minutes away, I started again. That’s when I noticed something I should have noticed an hour before….
declare @queryvar nvarchar;

Eureka! The variable was declared as nvarchar with no size given.

I changed my variable to a size of 4 and reran my queries:
declare @queryvar nvarchar(4);

Now, the results were the same (and were correct) for all 3 queries:

[Screenshot: identical, correct results for all three queries.]

Just to verify the earlier behavior, I checked the length of my original variable.

[Screenshot: query checking LEN of the original variable.]

And got the expected length of 1:

[Screenshot: a length of 1 returned.]

A couple of quick searches revealed that the SQL Server data types nchar and nvarchar default to a length of 1 when no length is specified. When used in a CAST, the default length is 30. For more information, please visit

http://technet.microsoft.com/en-us/library/ms186939(v=sql.105).aspx
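Both defaults are easy to demonstrate with a quick sketch (the variable name is arbitrary):

declare @unsized nvarchar;       -- defaults to nvarchar(1)
set @unsized = N'FR';
select LEN(@unsized);            -- returns 1

select LEN(CAST(REPLICATE(N'x', 40) as nvarchar));   -- returns 30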

I sent the information back to my customer and all was well.

There are two lessons I learned from this debugging experience. The first is to not be lazy when declaring variables – always specify the length. The more important lesson that I was reminded of that day is, “If you’re in a pasture and hear hoof beats…it’s probably not a zebra”. Before diving into any deep analysis or looking for the strange and obscure errors, go back to the beginning and recheck your basic query information. You’ll probably find your “horse”!


Thank you for reading my first RDX blog post! Please check back soon for more posts.

Root Cause Corrective Action Reports

Wed, 2014-03-19 12:04

Computer architectures are complex by nature. Actually, stating they are complex is like describing the Titanic as having experienced “a small leak”. If they weren’t, many of us wouldn’t be in this profession; our services wouldn’t be needed. However, as we all know, computer architectures are not becoming administrator-free or self-healing as many industry pundits would have us believe. Instead, they are continuing to increase in complexity.

Computer systems are also inherently dynamic. Application software changes, product component interaction, workload variations and system configuration alterations result in computing ecosystems that are constantly changing. Sometimes the results of these changes and new interactions are good, other times – not so much.

I don’t care what software you are using; you will run into the “only occurs on this release, on this version of the operating system, using this particular feature on the third Tuesday of the sixth month when it’s cloudy outside” BUG. Be sure to expect management to stop by and ask, “Well, why didn’t you test this on the third Tuesday of the sixth month when it was cloudy outside?”

There’s also human error. Any time humans are involved in anything, errors can, and will, occur. If you never make mistakes, send me a resume. I’m always looking for a “Patron Saint of Databases” here at RDX. It will also save us on travel costs because I’m sure you’ll be able to spread your wings and fly here on your own. The key to success is to have a documented set of procedures that ensures a continuous quality improvement process. One component of that process is the Root Cause Corrective Action Report, which identifies human errors, software bugs, product component incompatibilities – whatever the root cause is – and documents it, so we can evaluate and analyze all of the pertinent information and ensure that it doesn’t happen again.

As my old boss used to tell me (when I was a VERY junior DBA), “It really doesn’t make a difference who broke what. You are the technician who is ultimately responsible for fixing it and making sure it doesn’t happen again. The buck stops with you. If you can’t protect your environments, you aren’t doing your job.” We all know he’s absolutely correct. That’s the strategy we have implemented from day one at RDX. We assume total ownership for all databases we support.

Ownership comes with responsibilities. One of the most critical responsibilities we bear as support professionals is to ensure the systems we support are highly available and high performance. When they aren’t, we need to find out why.

In addition, it is common business courtesy to inform your fellow business personnel impacted by an outage of the root cause, the causal factors and the steps you and your team will take to ensure the problem will not impact them again. It makes no difference whether it is a product bug or a problem caused by an employee in another department or a different organization entirely; ownership means ensuring that the necessary actions are taken to identify the root cause and contributing factors and to implement the preventative measures needed to prevent recurrence.

One of the mechanisms you can deploy to facilitate this effort is the Root Cause Corrective Action Report (RCCAR) documentation process. A more detailed description of the RCCAR’s contents follows:

 

Root Cause Corrective Action Event Identification Information

This first section of the RCCAR provides information pertaining to the problem that occurred. It is important to provide as much information as possible on exactly what the impact was.

RCCAR Date
Date RCCAR was distributed.

RCCAR Identifier
Information that uniquely identifies this Root Cause Corrective Action document. RDX uses the customer name and the date the event occurred (CFOOTLOGISTICS – 03192014). RDX stores all RCCARs in an online repository for analysis.

Event Date
The date the event occurred. Because RDX supports a global customer base, we provide dates and times in the customers’ time zones.

Impact Length
The length of the impact.

Applications Affected
The information provided here can be an application, set of applications, business groups or a specific job, feature or function that was impacted.

Server Affected
Server or servers that were impacted by the event.

Database Affected
Database or databases that were impacted by the event.

Number of Users Affected
The number of users affected by the event can be challenging to identify. The support professional involved in the event most often needs to confer with their application contacts to estimate the number of users affected.

Event Impact
A description of the impact on the business that was caused by the event. If you don’t know the impact, you’ll need to contact the appropriate personnel to identify who and/or what process was affected. Here are a couple of examples to get you started:

  • Revenue loss
  • Idle employees
  • Unable to perform business functions (list them)
  • Loss of customer goodwill
  • Financial penalties
  • Reputation impact

 

Problem Description

This section is used to provide a detailed description of the issue that led to the event and the resulting impact on the business community. The challenge is to describe the actual problem without becoming so technically detailed that the only people that can understand the description are other IT support professionals.

It is important to remember that the RCCAR document’s audience may contain business and/or management personnel. The balancing act is to describe the problem in non-technical terms without resorting to stating “the server broke”.

 

Chronology

The chronology section provides benefits to the support professionals involved in the problem as well as the RCCAR’s intended recipients. Outlining each activity that occurred leading up to the event allows all of the technicians involved to consolidate their collective actions and firmly solidify what steps were actually performed. Formally documenting a chronology of the administrative and processing activities that occurred facilitates the problem resolution process and often leads to the identification of the true root cause and contributing factors.

The key is to get the actions in the right order based on the time the activity occurred. The action could be an administrative activity, workload or specific job being executed. The chronology is broken down into the following sections:

  • Any administrative or application workload activities that occurred before the problem was identified
  • Description of the activity or application generated process that caused the issue
  • List of actions taken to identify and solve the problem
  • Monitoring activities performed to ensure the problem was corrected and did not recur

 

Chronology Sample

Sunday, March 16, 2014

1.) 11:30 PM EST CFOOT Logistics experiences an unusually high CPU load on CFOOTSV1. RDX monitoring software generates a critical alert.

2.) 11:38 PM EST RDX’s PMRC team logs into server and discovers an operating system process consuming 56% CPU utilization. The process was started at 11:20 PM EST from the customer’s CRONTAB.

3.) 11:45 PM EST RDX Primary DBA, Jeff Klepner, and PMRC Assistant Manager, Ian Johnson, discuss issue with customer’s on call support technician, Jeremy Frost, and receive permission to terminate the CPU intensive process.

4.) 11:50 PM EST RDX Primary DBA continues work with customer’s on call support technician to identify root cause of hanging script.

Monday, March 17, 2014

1.) 12:05 AM EST RDX Primary DBA identifies that customer personnel recently updated the CRONTAB. The customer’s on call support technician affirms that the hanging process did not follow standard change control procedures and will follow up with the author. The customer contact states that the new script has not been installed on any other CFOOT Logistics servers. The RDX Primary DBA offers to review the script source code at the customer’s convenience and also offers to assist the customer with its change management process.

2.) 1:30 AM EST RDX’s PMRC team continues to closely monitor all CFOOT Logistics servers.

And so on…

 

Root Cause of the Problem

This is not a restatement of the problem, but the true root cause. In the case above, the root cause was a recently implemented job that was improperly coded and did not follow the customer’s application change management procedures. The root cause may match the problem statement, but there will be times when the two are different.

 

Contributing Factors

Did anything else contribute to the event? In our example, the following questions would need to be asked to identify the contributing factors:

  • Was the author educated in the organization’s change management procedures?
  • Does the change management process include a formal review, a “second set of eyes” on all code that is being implemented in production?
  • Was the new code sufficiently tested? What are the required testing procedures? Are they documented?
  • What is the employee’s training and experience level in appropriate coding techniques? Does the employee have sufficient training?

You can see by the questions above that the contributing factors often identify the future preventative actions shown below to prevent the event from recurring.

 

Resolution

This is a restatement of the corrective actions taken and does not have to be overly detailed. It allows the audience to clearly understand what the resolution was. In our example, terminating the process that was consuming excessive CPU was the resolution.

 

Future Prevention (Action Items):

List the steps that will be taken to prevent the problem from recurring. Be descriptive, detailed and thorough. This is the most important section of the RCCAR. The future preventative actions provide benefits to both the author of the RCCAR and the document’s recipients. In the example we are using, enforcing a review – that “second set of eyes” – before implementing script changes in production environments would be just one very worthwhile preventative action.

The Preventative Action Section should include a detailed description of the preventative action, who will perform that action and a completion date.

 

Wrapup

Your fellow employees who are impacted by a problem, whatever the root cause and causal factors, deserve to understand what happened. It is your responsibility as a support professional to take all of the steps necessary to identify the root cause, contributing factors and future preventative actions.

The RCCAR process also provides benefits to its authors because it helps them to consolidate, solidify and evaluate each of the individual activities that occurred. The RCCAR documentation allows the support group to analyze each of those activities to improve the quality of support they provide, identify improvements in problem identification procedures and ensure that preventable problems do not recur.

What does the future hold for database security?

Wed, 2014-03-19 10:45

Private corporations and public entities are engaged in a seemingly endless cybersecurity war with organized criminals, individual hackers, terrorist groups and a list of other virtual enemies. Deviant behavior has existed since the inception of mankind, prompting white hat entities to implement defense strategies to deter offenders. On the computing battlefield, most corporations have hired database experts to secure their networks. 

Reaping the sown information 
Organizations ignorant of the consequences that can result from a well-orchestrated cyberattack often try to mitigate the problem after a successful infiltration attempt. According to The Daily News Online, a database breach at the Catholic Archdiocese of Seattle potentially compromised thousands of Social Security numbers. An estimated 10 teachers along with an unknown number of volunteers at St. Rose Catholic School in Longview, Wash., are among the likely victims. 

The news source stated that hackers used the information to file fraudulent tax returns. Database support services were consulted by the archdiocese after the ordeal, discovering that many victims became aware of the issue when the United States Internal Revenue Service rejected their tax filings, both online and by mail. 

In addition, the religious organization reported the infiltration to the Federal Bureau of Investigation and consulted with a New York-based security firm in an attempt to identify the perpetrators. 

Improving techniques
It's not uncommon for corporations to set up multiple security measures to defend their networks against various network threats. Candice So, a contributor to ITBusiness.ca, noted that the arsenal of virtual weapons the average capable hacker possesses often discourages organizations desperate to maintain protection. However, Stephen Trilling, senior vice president of a major defense technology company, detailed a new method of database administration that may provide businesses with a viable solution. 

Speaking at a keynote address in San Francisco on Feb. 25, Trilling stated that organizations typically employ several products to perform different tasks, such as implementing firewalls, monitoring endpoints and identifying malware, among other procedures. He claimed that the next step is to create a system in which all of these security assets communicate with one another to prepare for full-scale attacks on computer networks. 

The professional noted that it's ideal for companies to rely on database administration services to protect their digital information and provide them with a comprehensive view of all virtual operations. This combination of human and technological visibility is the best insurance to possess, as it will prevent the most sophisticated attacks from being successful. 

IT industry progresses away from on-premise responsibilities

Tue, 2014-03-18 01:44

The advent of virtualization tools is driving businesses away from in-house network management and toward outsourced database administration services. It's not uncommon for enterprises handling massive amounts of digital information to favor an option that doesn't require a massive data center to adequately host applications. Managing a system that's relatively new to corporate IT infrastructures typically poses a challenge for organizations wishing to decrease expenses without sacrificing performance capabilities.

Staying ahead of the game and knowing how to play it
The bottom line for many companies is that their specialty isn't in database administration, but their practices are streamlined and supported by the technology. CIO Magazine contributor Stephanie Overby corresponded with IT specialist Craig Wright, who stated that virtualization has provided organizations with a comprehensive platform in which all networking necessities can be accessed. Whereas the traditional data center required on-premise operating procedures such as hardware installation and cable running, the updated environment requires IT professionals to comprehend the active distribution of operating systems across multiple platforms.

"Utilizing cloud and virtualization opportunities to execute remote infrastructure and management services, organizations are able to leverage multiple providers in low cost delivery centers," Wright told the news source.

Ultimately, choosing an appropriate remote database support provider starts with recognizing which one understands the need to administrate a solution that appears complete to end users. Those that understand how company employees interact with the environment will prove best able to handle the infrastructure's operability.

Follow the trends
A number of IT veterans and rookies agree that working for a company specializing in database support services, cloud deployments or software vending is a good way to ensure job stability. Whereas internal IT departments are often viewed as necessary expenses by companies operating in other industries, organizations delivering computing solutions perceive their personnel as profit-yielding assets. For this reason, many of those new to the market often seek employment with corporations heavily entrenched in IT services.

Therefore, most of the talent can be found in outsourced companies. According to IT Business Net, as virtualization and cloud computing grow more complex, the level of expertise required to effectively manage and operate large databases will increase. Data collection is now a common practice among the majority of enterprises in the United States. With more methods of obtaining digital information opening up, networks may become more complex and more exposed to security threats, prompting a new reactionary sub-industry in the form of protection as a service.

Enterprises invest in remote database support for reduced costs

Mon, 2014-03-17 09:44

Eager to capitalize on the benefits associated with analytics tools and embrace a cost-efficient way of doing so, enterprises throughout the United States are outsourcing to database administration services. The networking capabilities of immense data centers and cloud deployments have intimidated businesses with limited IT staffs and budgets. Relying on experts to constantly monitor network activity is cheaper than some executives would expect. 

It's only getting bigger 
According to a report released by Research and Markets, the database administration industry is expected to grow at a compound annual growth rate of 8.5 percent over the next four years, reaching $6.35 billion by 2018. The study covered the rise in demand for IP address management, performance surveillance, traffic monitoring, security maintenance and mobile device organization. 

Analysts with the source stated that the rise in demand is due in part to a lack of confidence in in-house IT staff capabilities. Expansion into cloud infrastructure has yielded profitable results for the majority of U.S. enterprises, but they feel that database experts thoroughly acclimated to the technology are better able to manage these complex systems.

"The network management services market is expected to grow from $1.26 billion in 2013 to $2.08 billion in 2018 at a CAGR of 19.5 percent during the forecast period," stated the report. 

Other contributing factors 
The general consensus among business professionals in the U.S. is that outsourcing to remote database support is cheaper and more effective. Quality service is more cost-efficient with these companies because they possess a greater command of the digital language and can readily anticipate network shutdowns before they occur. 

Tom Nolle, an IT expert and contributor to online magazine No Jitter, acknowledged increased dependence on outside organizations, greater database infrastructure complexity and faster network adaptation as the three key reasons why outsourcing proves to be the better option. 

"Today, many workers are totally dependent on network-delivered information, to the point that their productivity falls instantly to zero if the network breaks," said Nolle. 

The architecture of the average IT solution is vastly complicated, littered with firewalls, routers, application delivery controllers and load balancers, accompanied by a seemingly endless stream of administration tools. In the past, when networks were simpler, remote DBA wasn't as necessary as it is today. Intricate database systems and frameworks have proved too complicated for an on-premise IT department with limited knowledge of new technology. 

Expansion and new innovations in the industry are sure to drive organizations toward outsourcing, whether to save money or ensure optimal maintenance. 

Cloud adoption fosters need for database administration

Mon, 2014-03-17 01:29

IT professionals throughout the world have debated how to best secure cloud servers from the malevolent figures looking to gain access to company data. Conventional firewalls primarily implemented for on-premise data centers haven't made the cut, leaving remote database support organizations to devise new ways of protecting the digital information belonging to their clients. 

Starting with the basic connections 
Before worrying about mobile communications, organizations need to ensure that in-house PCs and devices are not compromising data confidentiality. According to Rick Harvey, a contributor to Network Computing, many cloud providers believe their products to be secure because their data is encrypted. However, some vendors preoccupied with expanding storage capabilities often neglect where the key is held. 

"One cloud storage provider touts its server-side encryption as freeing customers from the hassle and risk of managing their own encryption and decryption keys," noted Harvey "In reality, this leaves the user's information vulnerable to snoops."

As a result, many cloud providers employ server-side security to maintain control of the system, which requires them to protect every data-hosting asset a client may own, from disks to routers. A favorable option entails having database administration maintain an in-house presence, allowing them to gain a better understanding of a client's networking assets. 

Ultimately, familiarity with the company's server landscape will give database experts the ability to move past the basic infrastructure and client-side security. Although entrusting them with the encryption keys may seem risky, assigning one person to protect one key is a better option than trying to secure a collection of three dozen. 

Mobile's not going away, so secure it 
Business employees feel empowered by their ability to read and edit company documents on their smartphones. However, being able to do so means that they're creating an avenue that a malicious figure could exploit to gain access to sensitive corporate material. A common problem for the average IT department lies in managing all the different remote user logins. 

CSO magazine reported that NetIQ recently developed what it believes to be a solution to the latter problem. Ian Yip, a security specialist with the company, told the source that the new tool, MobileAccess, enables secure access through a single touch from iOS or Android mobile devices to essential business services and data. 

"If a user has access to an on-premise instance of SAP and an account with Office 365, log in with a single sign on to MobileAccess will authenticate users to those services," the article stated. 

The program is sure to assist remote DBA experts looking to provide security for clients utilizing mobile communications. 

Oracle Database 12c: New Features Overview

Fri, 2014-03-14 06:46

In case you missed RDX’s presentation on the new features released with the latest version of the Oracle RDBMS software, today’s blog post will recap the major features that provide some of the biggest changes from previous releases of the database:

  • Multitenant databases
  • Automatic data storage and compression based on usage patterns
  • New features to Oracle MAA (Maximum Availability Architecture)
  • Security
  • Implementing Database as a Service using Cloud Control 12c

Multitenant Databases

Those of you who know Microsoft SQL Server are already familiar with this notion of multiple databases in one instance, but it is something completely new for the Oracle world. It allows multiple databases to run inside one instance of the software using pluggable databases. This allows segregation of data as well as consolidation, so resources are utilized more efficiently. This feature is available with all database editions, but Standard Edition only allows for one pluggable database per container database. Enterprise allows multiple pluggable databases per container.

Container Databases

Let’s dig into what a container database is and look at how this is a departure from the older releases of the database software. The container database holds the control files and online redo logs, gets parameters from the instance’s SPFILE, and has its own SYSTEM, SYSAUX, UNDO and TEMP tablespaces. The container database also houses the data dictionary and security information shared among the pluggable databases. Container databases hold the pluggable databases, which contain their own security and data that cannot be seen or accessed by another pluggable database unless the privileges have been granted to allow it.
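As a quick illustration, from the root container you can list the pluggable databases it holds (V$PDBS is the 12c view for this):

SELECT con_id, name, open_mode FROM v$pdbs;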

Pluggable Databases

Pluggable databases are exactly what their name implies. They can be unplugged and plugged into a container database. They can contain multiple schemas and have their own tablespaces and local security. The best example to explain this feature is a container database for the Foo Corporation that has pluggable databases for Accounting, HR and Sales, which isolate the data and security to those that need access while consuming fewer resources than if each were an individual Oracle instance.

Some of the biggest benefits we’ve seen for the implementation of pluggable databases are the ability to easily clone and upgrade pluggable databases. Since they are governed by container databases, they can be unplugged from one container database and plugged into an upgraded container database, and the upgrade of the pluggable database is complete. The same goes for cloning of production databases to other environments. The pluggable database can be copied from one container database and plugged into another for refreshing development, QA or testing systems.
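As a rough sketch of that unplug/plug workflow (the database name and paths are illustrative, not from any particular environment):

-- Close and unplug a PDB from its current container:
ALTER PLUGGABLE DATABASE hr_pdb CLOSE IMMEDIATE;
ALTER PLUGGABLE DATABASE hr_pdb UNPLUG INTO '/u01/stage/hr_pdb.xml';

-- Plug it into another (e.g., already upgraded) container database:
CREATE PLUGGABLE DATABASE hr_pdb USING '/u01/stage/hr_pdb.xml' NOCOPY;
ALTER PLUGGABLE DATABASE hr_pdb OPEN;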

Automatically Optimizing Data Storage and Compression

It seems that data retention is on the rise, but IT managers have a conflicting objective: keep storage costs down while still having efficiently performing databases and the ability to scale. Normally, this is addressed using some form of Information Lifecycle Management (ILM) that moves data based on age or temperature (frequency of access or modification), but this usually involves physically moving data between different storage types in some manual way. In 12c, the Advanced Compression Option has added some ILM features called Automatic Data Optimization and Heat Mapping.

Automatic Data Optimization

Automatic Data Optimization (ADO) monitors the temperature and age of the data stored in tables and partitions and updates that information over time to keep the heat map accurate. There are three temperatures of data in the heat map. Hot data has recently been added and is typically used by OLTP applications. As that data ages, it becomes warm. It is typically used by business intelligence users and is often considered to be in a read-only state. Data that is retained for compliance or reporting purposes and is rarely accessed is considered cold.

Smart Compression and Storage Tiering

Using the heat map leads us to our next topic of smart compression and storage tiering. Each table can have different storage and compression policies based on the information collected by the heat map. Having the heat map allows data to be moved to different storage tiers and makes best use of storage costs by putting cold data on cheaper, slower storage or having compressed and hot data remain on fast more expensive storage.
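For illustration, ADO policies along these lines tie the heat map to compression and tiering (the table and tablespace names are made up for the example):

-- Heat mapping must be enabled before ADO policies can act on it:
ALTER SYSTEM SET HEAT_MAP = ON;

-- Compress rows that have seen no modification for 30 days:
ALTER TABLE orders ILM ADD POLICY
  ROW STORE COMPRESS ADVANCED ROW
  AFTER 30 DAYS OF NO MODIFICATION;

-- Move cold segments to a cheaper storage tier:
ALTER TABLE orders ILM ADD POLICY
  TIER TO low_cost_ts;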

Oracle’s Maximum Availability Architecture

With the growing demands of the digital world today, more and more businesses are requesting that critical applications be available 24x7x365, which doesn’t leave much time for the maintenance needed to keep the supported database functioning at peak performance. In the 12c release, several modifications and additions to the architecture have made some maintenance tasks available online while users are accessing the data.

Online Operations

Several of the schema maintenance DDL commands have been modified in 12c to remove blocking locks while performing the operation. The commands that have been modified are:

  • DROP INDEX ONLINE
  • DROP CONSTRAINT ONLINE
  • SET UNUSED COLUMN ONLINE
  • ALTER INDEX UNUSABLE ONLINE
  • ALTER INDEX [VISIBLE | INVISIBLE]

Along with these DDL operations, moving datafiles while users are accessing the tablespaces has been changed to allow for online moves to new file systems or into ASM.
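A few of these in action (the object names and path are illustrative):

DROP INDEX orders_ix ONLINE;
ALTER INDEX orders_ix2 INVISIBLE;
ALTER TABLE orders SET UNUSED (legacy_col) ONLINE;

-- Relocate a datafile while it stays available to users:
ALTER DATABASE MOVE DATAFILE '/u01/oradata/prod/users01.dbf' TO '+DATA';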

ASM Improvements

One enhancement to note with ASM in 12c is disk scrubbing. Disk scrubbing will check blocks for logical corruption and repair the issue automatically when the diskgroup is configured with normal or high redundancy. This operation utilizes the ASM rebalancing operations to minimize I/O overhead.
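Scrubbing can also be invoked on demand; a sketch, with an illustrative disk group and file name:

ALTER DISKGROUP data SCRUB POWER LOW;

-- Or target and repair a single file:
ALTER DISKGROUP data SCRUB FILE '+DATA/PROD/DATAFILE/users.271.837292837'
  REPAIR POWER HIGH;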

Global Data Services

Global Data Services is a new concept that allows applications to connect to a single service name that can actually span a combination of databases utilizing GoldenGate replication or Active Data Guard, whether single-instance or RAC. Oracle provides a whitepaper on Global Data Services that can be found here.

RMAN Enhancements

Several enhancements have been made to RMAN in 12c as well. The RMAN command interface has been modified to allow for SQL statements to be run without having to prefix the line with the SQL operator. This enhancement supports the SELECT and DESCRIBE statements that are used in SQL*Plus.

Enhancements to the DUPLICATE, BACKUP and RESTORE commands in 12c have performance enhancements to get commands to complete faster utilizing parallel processes. The DUPLICATE FROM ACTIVE DATABASE command has been improved to allow for datafiles to be split into smaller segments and compressed when transported to the auxiliary database to utilize multiple channels to restore one datafile. This is extremely useful for larger BIGFILE tablespaces that don’t have size limits. The BACKUP and RESTORE commands now contain the ability to create backups that can be used on different platforms. Before this feature in 12c, transportable tablespaces were the fastest way to convert a database over to a new platform, and this was only introduced in 10g. The RESTORE command was further improved to allow for a RECOVER TABLE operation that can restore a single table or subset of tables without having to restore the whole database and export the requested tables.
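For example (the schema, table and destination path are illustrative):

-- 12c lets you run SQL directly at the RMAN prompt:
RMAN> SELECT name, open_mode FROM v$database;

-- Recover a single table without restoring the whole database:
RMAN> RECOVER TABLE hr.employees
        UNTIL TIME 'SYSDATE - 1'
        AUXILIARY DESTINATION '/u01/aux';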

Data Guard Enhancements

With disaster recovery being such an important component of the Oracle MAA, the 12c enhancements to Data Guard make role transitions and monitoring through the broker more reliable and less complex by implementing single-command role transitions along with additional health checks, error reporting and problem resolution.

Database Security

With the recent security breaches, security is becoming an increasing concern for all aspects of applications, and 12c has made further enhancements that isolate data and restrict users to only the privileges and objects necessary to function. The areas that we will focus on here are:

  • Data Dictionary Changes
  • Advanced Security
  • Transparent Data Encryption
  • Data Redaction

Data Dictionary Changes

Some of the most important data in the database has been held in the data dictionary. Since 8i, Oracle has been changing security around the data dictionary with each release, and 12c is no different. One of the biggest changes to note is that the SELECT ANY DICTIONARY privilege no longer allows querying of some of the most sensitive security-related tables, such as DEFAULT_PWD$, ENC$, LINK$, USER$, USER_HISTORY$ and XS$VERIFIERS. Additional privileges have been created to allow for further lockdown of SYSDBA. The new SYSBACKUP privilege allows a user to back up and recover the database without needing full SYSDBA privileges. Another, SYSDG, has been created for the administration of Data Guard operations.
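A quick illustration (the user names and passwords are hypothetical):

-- Delegate backup duties without handing out SYSDBA:
CREATE USER backup_op IDENTIFIED BY "StrongPassword#1";
GRANT SYSBACKUP TO backup_op;

-- Delegate Data Guard administration:
CREATE USER dg_admin IDENTIFIED BY "StrongPassword#2";
GRANT SYSDG TO dg_admin;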

Oracle Advanced Security

The Advanced Security database option allows for additional protection of sensitive data within the database and also during transport of the data over the network, whether to applications or to backup/export locations. The two ways this is implemented are Transparent Data Encryption and Data Redaction.

Transparent Data Encryption

Transparent Data Encryption is part of the Advanced Security Option. It allows data to be encrypted on disk and can be as granular as a column or cover an entire tablespace. Management tools and commands have been consolidated and integrated with Enterprise Manager and the command line to make key management simpler. An additional administrative privilege, SYSKM, has been created for TDE that allows for management of keystores and master encryption keys.
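To sketch that granularity (the table and tablespace names are illustrative):

-- Encrypt a single sensitive column:
CREATE TABLE customers (
  id  NUMBER,
  ssn VARCHAR2(11) ENCRYPT USING 'AES256'
);

-- Or encrypt everything placed in a tablespace:
CREATE TABLESPACE secure_ts
  DATAFILE '+DATA' SIZE 100M
  ENCRYPTION USING 'AES256'
  DEFAULT STORAGE (ENCRYPT);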

Data Redaction

With the use of data redaction, users can see all, a portion or none of the sensitive data. For example, when a customer calls a call center, the last four digits of the caller’s social security number are used to verify their identity, so the call center users don’t need to see the full social security number. Data redaction would allow the database to hold the full social security number but only display the last four digits through the call center application.
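The call-center example might look like this using the DBMS_REDACT package (the schema, table and policy names are invented):

BEGIN
  DBMS_REDACT.ADD_POLICY(
    object_schema       => 'APP',
    object_name         => 'CUSTOMERS',
    column_name         => 'SSN',
    policy_name         => 'redact_ssn',
    function_type       => DBMS_REDACT.PARTIAL,
    function_parameters => DBMS_REDACT.REDACT_US_SSN_F5,  -- show only the last four digits
    expression          => '1=1');                        -- apply to every session
END;
/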

For more information, the Oracle white paper on TDE and Data Redaction can be found here.

Implementing Database as a Service Using Cloud Control 12c

For some larger corporations, database services can be charged to a business unit so that IT costs can be factored into operational costs of the division. In 12c, Oracle has introduced a way for internal business groups within a company to request database environments with a self-service portal using Cloud Control. This allows business groups to request a new schema, a whole new database based on a template or even a new database server.

To learn more about this, visit Oracle’s site on Cloud Management.

These topics only scratch the surface of the new features available in Oracle’s 12c Database release, and they don’t even touch the management capabilities added to 12c Cloud Control. Check back in the coming months for blog posts on these and many other topics related to all things Oracle. You can also visit RDX at Booth 835 at Collaborate next month to learn more about these new features.

Database obstacles: Size matters

Fri, 2014-03-14 02:16

Corporations throughout the United States are looking to capitalize on the influx of digital information the current market has been producing. In order to do so, upgrading server capabilities or transitioning them into the cloud has been on many executives' agendas. To help them complete these endeavors, database support services provide extensive surveillance and maintenance assistance. 

Deducing the average workload
Noel Yuhanna, an analyst with Forrester Research, claimed that remote database support companies manage 40 servers on average. However, there are extreme disparities hidden in this conclusion. The professional stated that some experts oversee 275 while others administrate eight. The reasoning behind this extensive incongruity lies in the size of the databases as well as the sophistication of tools used by DBA experts.

Yuhanna claimed that larger, terabyte-sized servers often necessitate additional tuning, backup, recovery and upgrade initiatives in order to ensure that they're operating to their full potential. Contemporary database support services can usually handle about five terabytes per server.

"In other words, one DBA can effective manage 25 databases of 200GB each or five 1 terabyte databases," said Yuhanna. "These include production and non-production databases."

However, the capabilities of these professionals have expanded thanks to improvements in cloud applications and automation tools. In addition, many of these organizations are employing recent graduates who have received more extensive education in the technology. 

Where the contention lies
Datavail recently conducted a survey in conjunction with Forrester Consulting in an attempt to identify common database problems the modern organization typically encounters. An analyst from the corporation stated that complex networks of multiple servers for a single client have outstripped the capabilities of in-house IT departments and instigated outsourcing efforts. The expanding size of databases has put more pressure on enterprises to maintain security and reliability, but the budget impact necessitates an outside perspective. 

The conjoined organizations deduced that companies that attempt to handle database administration in-house typically encounter the following issues:

  • About 85 percent claimed they have difficulty delivering optimal performance
  • Nearly 75 percent reported trouble integrating their data
  • Approximately 71 percent lack the necessary personnel
  • An additional 71 percent couldn't properly secure private data
  • Finally, 69 percent stated that they couldn't handle the rapidly growing volume of digital information

Enterprises that chose to outsource to database administration services claimed that the aforementioned issues were resolved. Handing the responsibility of managing an increasing amount of servers to those best suited for the job can result in significantly reduced labor costs along with streamlined data processing.

A booming cloud market has IT pros wanting more

Thu, 2014-03-13 09:00

The cloud computing market is far past its phase as a trend or buzzword – the industry has been growing at a rapid rate for years now and shows no sign of stopping any time soon. Businesses cannot seem to get enough of cloud-based storage, infrastructure and application services, especially now that mobile device management has become a more prominent factor in the enterprise tech landscape. While some IT security concerns may be holding the cloud back from attaining its maximum level of impact on the business world, the market points to a future in which these issues will seem antiquated. However, with more businesses jumping onto the cloud bandwagon, high-profile tech jobs are at a premium and will only become more valuable as the years roll along in the cloud's favor.

Demand for cloud services results in big numbers for the market
Revenue speaks for itself regarding the popularity and influence of any given technology. In the case of the cloud, private enterprise will triple its spending on related services, applications and infrastructure from 2011 to 2017, according to a press release examining a recent report from IHS Inc. Even from 2013, when the market was valued at around $145 billion, the number this year is expected to reach $174 billion, a 20 percent increase. As the industry continues to supply a cloud-crazy IT landscape that ranges beyond the private sector into nonprofit, government and health services organizations, the market was estimated to reach a value of $235 billion by 2017, roughly three times its $78 billion valuation in 2011.

IT experts seek a pretty penny for their services
Ensuring database quality, security and general network performance has always been a top priority for organizations, and most will spend whatever it takes to enjoy peace of mind when it comes to their IT procurements. According to a recent report from IDG News Service, career website Dice.com has conducted research revealing that IT talent is expecting better pay, improved working conditions and more responsibility for company policy in 2014 as a result of the cloud computing movement. IT retention will prove difficult this year as 65 percent of survey respondents were confident that they could find a better position if they wanted to leave their current jobs. Highly skilled roles will always be in demand, but companies will have to take special care of their tech talent as the cloud gains more steam.

Database experts expect big happenings from Oracle in 2014

Thu, 2014-03-13 01:51

Oracle has been at the forefront of IT innovation for over a decade. The solutions it's provided for organizations throughout the world have transformed the industry's platform. Success stories litter the Internet.

CSX, a rail transportation supplier, recently contracted Oracle experts to implement the hardware and software vendor's database management system. Prior to the migration, CSX had more than 400 machines supporting the company's critical commercial and proprietary business applications, Database Trends and Applications reported. The organization used Oracle Real Application Testing in conjunction with Enterprise Manager to reduce business impact and downtime during the transition. 

A database administration service assisted the company in upgrading CSX's system in less than half the time required for the enterprise's previous update. The testing program provided the business with a comprehensive view of changes made to the data infrastructure and resolved issues before the solution was deployed. On the other end of the program, Enterprise Manager used the digital information obtained through Real Application Testing to review how prepackaged and custom SQL workloads would be affected in the new database. 

"The fact that the upgrade was completed with no business disruption and in half the time of our last upgrade was a crucial win for our organization," said Maritza Gonzalez, technical director of data management for CSX.

Currently, the rail transportation company is using Oracle Enterprise Manager 12c with the assistance of a remote database support group to oversee and administrate over 500 databases. 

What's in store? 
According to Forbes, Oracle recently brought Bluekai, a digital marketing firm specializing in cloud deployment, under its IT umbrella. The article stated that the acquisition is a sign that the organization is relying heavily on enterprises moving away from traditional advertising campaigns and toward the customized, virtual counterpart. 

Despite 16 percent growth during 2013, Oracle remained in the shadow of its competition with the broad composite index standing at 41 percent. In an effort to maintain pace, it is believed that Oracle regards the recent acquisition as a positive step toward expediting revenue gains. Among the eight procurements the company made last year, the majority of them are heavily involved in cloud-based Software-as-a-Service (SaaS) marketing and business-to-consumer software. 

"The growth in mobile and other hand-held devices globally induced a shift in customer buying process from a sales-driven approach to a marketing-driven approach," the source noted. 

The organization's willingness to keep up with the times may result in a significant return on investment. Online retailers continue to deploy digital marketing software and it looks as if Oracle is hoping to provide it. 

Is BYOD falling short in today’s tech landscape?

Wed, 2014-03-12 05:41

The bring-your-own-device (BYOD) movement has received plenty of press and praise for its forward-thinking approach to mobile integration within the workplace. Employees love the flexibility of being able to access business information and use applications to their hearts' desire, whenever and wherever they want. Executives also enjoy using their own smartphones and tablets on the job, but are more likely to appreciate the increased productivity, collaborative efforts and cost savings they see from a mobile plan that doesn't limit workers from conducting business the way that best suits them.

Of course, not all feedback has been positive with regard to BYOD, especially in light of recent high-profile data breaches that have sent retailers and financial groups scrambling to evaluate and rejuvenate their security measures. One of the most common criticisms of BYOD is the perception that it creates vulnerabilities within a company's network and risks employees' devices falling into the wrong hands. These IT security concerns have kept personal technology from reaching its full potential in the business environment, but companies are still doing their best to find a way for the movement to succeed and reap the benefits that BYOD has promised from the beginning.

Where BYOD is missing the mark
Although the concept has come a long way from its early stages, BYOD technology is still encountering growing pains as it seeks to become a staple of enterprise IT strategy. According to a recent article from InformationWeek, mobile device management (MDM) software is attempting to harness the power of BYOD but with a variety of outcomes. On one hand, dedicated security platforms designed specifically to enhance productivity and data protection with regard to mobile are becoming more popular for businesses that don't want to risk leaking information. However, there are still companies using basic, application-specific MDM approaches that don't offer the widespread compatibility necessary for adequate network defense.

Despite decision-makers differing on how or whether MDM should be utilized at all, here are a handful of components that BYOD policies will need to take into account if they are to succeed. InformationWeek explained that devices will have to be ready for consistent updates and have an option to remotely wipe data in case the phone or tablet is lost or stolen. Without these basics, IT departments shouldn't expect much from their BYOD efforts.

Computing complexity increases demand for DBA services

Wed, 2014-03-12 02:12

Due to the high level of competition among operating system providers and software developers, computing code has grown much more complicated. Although organizations strive to make operating systems easy to work with, new applications capable of communicating with other devices have prompted businesses to hire database administration services to monitor these interactions. The flexibility of telecommuting is not an element the typical enterprise is willing to sacrifice, even if it has to adopt a more intricate OS.

Adjusting to the demise of the desktop? 
Consumers wouldn't have dreamed of such a scenario ten or even five years ago. The vast majority of organizations still use in-house machines connected to their databases, but mobile devices are eroding what was once a PC-centric environment.

Paul Gilster, a contributor to News Observer, reported that the market for desktops and laptops has decreased over the past three years. Despite Microsoft's recent release of Windows 8.1, adoption of the OS has been relatively lackluster, claiming a mere 4 percent of the global OS market. 

In an effort to bounce back, many software developers are trying to push dual OS, a move that many database experts have criticized as an impractical solution. Intel claimed that the ultimate goal is to allow seamless movement between Windows software and the applications found on an Android phone or tablet. 

Gilster claimed that PC users may not tolerate such an endeavor. The typical employee relies on these machines to perform everyday tasks that have grown more complex over the years. Introducing a dual-OS system would only frustrate users, most of whom don't employ the majority of the features already on their desktops.

Is it worth the cost of protection?
Because each OS runs different code, remote database support providers have to adjust the security protocols for these machines to ensure network communication isn't compromised. For example, if a business used PCs running two operating systems, these professionals would have to completely redesign the company's protection architecture.

According to Secunia, desktops are just the tip of the iceberg. The source stated that the majority of security flaws lie in the multitude of applications employees use on a daily basis. Mobile devices have accelerated this trend by allowing workers to feed their data into company databases.

"With a 73.5 percent share, the primary attack vector available to attackers to trigger a vulnerability in 2013 was remote network," the report stated.

Although such connectivity has been a boon for many businesses, it may be advisable for them to hire remote database services capable of protecting distant communications. 

Acoustic processing going mobile?

Tue, 2014-03-11 10:08

Data collected through online surveys, e-commerce transactions and page visits is one matter, but obtaining digital information from everyday sounds could cause many businesses to consult database support services for assistance. As if analytics couldn't get any bigger, IT technicians are currently developing technologies capable of identifying acoustics. 

In contrast to producing dormant data, the program creates active information capable of communicating with network users and diagnosing machine performance issues. According to Phys.org, researchers at the Polytechnic University of Catalonia (UPC) have constructed wireless sensors capable of recognizing minuscule fissures in industrial machinery. The technology receives and translates the ultrasonic sounds the instruments produce to determine whether a breakdown is about to occur.

If the sensors encounter widespread adoption, it is likely that remote DBA support organizations will be hired to monitor their activity throughout factory networks. The number of instruments deployed to a client may vary considerably with the size of the facility and the complexity of the operation.

Team leader Luis Romeral, along with representatives from organizations in Ireland, Estonia and Poland, developed the electronics used to fabricate the mechanism. The result is a self-contained device that harvests energy from heat, mechanical motion or vibration in its environment and uses algorithms to detect and identify problems.

"The sensor was designed to capture ultrasonic waves at ranges that move between 50 and 300 kilohertz," explained Romeral. "At this frequency range structural faults can be found in mechanical systems."

What comes after? 
Some experts believe the technology could be used to alert pedestrians listening to music to oncoming vehicles. InformationWeek reported that scientists are currently designing a machine-learning program that scrutinizes ambient audio to predict subsequent events. One Llama Labs, a New York City-based developer of acoustic analysis systems, claimed that the processors and battery life of modern mobile devices are allowing the technology to go mainstream.

David Tcheng, the company's co-founder and chief science officer, told the news source that developments in acoustic identification could change the world of surveillance. One Llama Labs' Audio Aware machine-learning application is capable of processing hundreds of sounds from its surroundings. The organization's original intent was to develop a program that provides music recommendations based on tone and ambience, as opposed to artist and album correlations. 

Aside from creating an intelligent playlist, acoustic analysis technology has the potential to become a revolutionary monitoring force within the next few years. Some organizations using the software may turn to remote DBA services to maintain the connection between the mobile devices hosting the program and the databases storing the data.

Disaster Recovery Planning

Tue, 2014-03-11 07:03

DBAs tend to get pulled in lots of different directions in a company: performance tuning, database design, systems administration, networking, report writing; the list goes on and on. With all of the diverse tasks a DBA must tend to every day, sometimes we forget the core responsibility of a database administrator: protecting data. One of the fundamental aspects of protecting data is planning for disasters.

When most people think of disasters, they think large scale:

  • Fire in the Data Center
  • Flood
  • Tornado
  • Hurricane
  • Widespread Power Outage

When it comes to disasters, there are many smaller scale problems that can cause an outage that are just as critical to the data that we’re responsible for:

  • Server Blue Screen of Death
  • Failed Disk Drive
  • Corrupt Database or Corrupt Database Page (see the consistency-check sketch after this list)
  • SQL Injection Attacks
  • A DELETE Statement Where the User Forgot to Add a WHERE Clause (whoops, I just deleted all sales data!)
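
For the corruption scenarios above, SQL Server ships with a built-in consistency checker that should run on a schedule rather than only after trouble appears. A minimal sketch, assuming a hypothetical database named SalesDB:

  -- Check the logical and physical integrity of every object in the database.
  -- NO_INFOMSGS suppresses informational messages so only errors surface.
  DBCC CHECKDB ('SalesDB') WITH NO_INFOMSGS;

Running this from a scheduled SQL Server Agent job means page-level corruption gets caught while intact backups still exist to restore from.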

Microsoft SQL Server comes with a variety of high availability and disaster recovery technologies (database backups and restores, clustering, mirroring, log shipping, replication, SQL Server 2012 AlwaysOn Availability Groups) to help us prepare for the worst. Each of these technologies helps us recover from specific disaster scenarios, and many of them can be combined to prepare for multiple scenarios.
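
Backups underpin every one of those technologies, so they are worth sketching first. A minimal full-plus-log routine, assuming a hypothetical database named SalesDB and a hypothetical local backup path:

  -- Full backup; CHECKSUM validates each page as it is written,
  -- and INIT overwrites any previous backup sets in the file.
  BACKUP DATABASE SalesDB
    TO DISK = 'D:\Backups\SalesDB_Full.bak'
    WITH CHECKSUM, INIT;

  -- Frequent log backups bound data loss to the interval between them.
  -- The database must be using the FULL recovery model.
  BACKUP LOG SalesDB
    TO DISK = 'D:\Backups\SalesDB_Log.trn'
    WITH CHECKSUM;

The log backup interval is effectively your recovery point objective: a 15-minute schedule caps losses at roughly 15 minutes of transactions.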

Much thought and planning go into developing a disaster recovery plan that meets the recoverability needs and budget of the organization. Here are some key questions to ask yourself and your business users when planning a DR strategy:

  • How critical is each database to the business?
  • How long can the business survive without each database? 1 minute? 8 hours? 2 days?
  • Where are database backups stored? What happens if this location is unavailable?
  • When is the last time I tested my database backups to make sure they are valid? (See the sketch after this list.)
  • Do I know how to restore the system databases if the server needs to be rebuilt? When is the last time I practiced this?
  • If I get hit by a bus, are there simple, easy-to-follow instructions that someone else can use to get back online?
  • Does every user really need SysAdmin or Database Owner privileges?!?
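
On testing backups, a backup file can be sanity-checked without performing a full restore. A minimal sketch, again using the hypothetical SalesDB backup file:

  -- Confirm the backup set is complete and readable; with CHECKSUM it
  -- also revalidates the page checksums recorded during the backup.
  RESTORE VERIFYONLY
    FROM DISK = 'D:\Backups\SalesDB_Full.bak'
    WITH CHECKSUM;

  -- List the backup sets the file contains (useful when several
  -- backups share one media set).
  RESTORE HEADERONLY
    FROM DISK = 'D:\Backups\SalesDB_Full.bak';

Keep in mind that VERIFYONLY is a sanity check, not proof of recoverability; nothing replaces periodically restoring the backup to a test server and running DBCC CHECKDB against the result.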

An outline of the recoverability requirements and their associated costs will drive the DR plan you design. For example, failover clustering is more complex and costly than log shipping, but if your primary server hits a blue screen of death, a failover cluster will get you back online much more quickly and with less data loss. Can your business afford a couple of hours of downtime, or the loss of 15 to 60 minutes of data?

The summary chart below can be used to help compare the available options along with relative costs:

[Chart: comparison of SQL Server high availability and disaster recovery options with relative costs. Adapted from Database Mirroring in SQL Server 2005, Table 15: Comparing SQL Server 2005, http://technet.microsoft.com/library/Cc917680?]

A good DR plan will target each type of disaster and spell out the steps needed to recover with the least impact to users and the least risk of data loss. Don't wait until a disaster strikes to start building your DR plan. A real disaster is a stressful time, and as the protectors of the data, we need to be cool, calm and collected when facing one. Having a pre-established plan will alleviate a lot of pressure so that you can make the best decisions to recover data efficiently.

Thank you for reading my first RDX blog post! Stay tuned for more posts soon.

It’s time for a new data assessment plan

Mon, 2014-03-10 01:53

Because many organizations are shifting to different platforms, database administration support companies are being contacted by private enterprises and public agencies to conduct extensive reviews of their data hosting systems. Whether on-premise or in the cloud, amassing large amounts of data through PCs, interfaces, mobile devices and other machines has become the norm.

Keeping a lid on it
Companies new to harnessing digital information typically bite off more than they can chew, which is why they consult database experts to make sense of it all. Enterprise Apps Today reported that these professionals usually start by evaluating their clients' master data. Reviewing all data sources and how they are organized, with best practices in mind, can save enterprises from overlooking costly errors, such as misaligned data for assets and customers.

"The key is to examine company data from an unbiased perspective with the most critical business processes in mind," noted the source. 

Where database support services really benefit companies is in their ability to identify data inaccuracies. Because the environment in which digital information is held spans entire departments, different machines may inadvertently introduce mistakes.

Assessing mobile devices 
Smartphones and tablets are increasingly used to feed company databases, but monitoring and protecting these avenues has been an undervalued priority. Absolute Software recently surveyed 1,000 employees who use mobile phones for work. When questioned about penalties for losing company data, 25.2 percent of respondents claimed that they shouldn't be punished because it wasn't their responsibility to practice safe security measures.

This apparent lack of interest would explain why many enterprises hire remote database services to monitor connections between machines and the network. The survey also revealed widespread apathy: 75 percent of corporate employees felt that while they should be reprimanded for such a transgression, the punishment should be mild.

"Although employees may feel the data on their device is only worth $500, when you consider that most data breach penalties can range into the millions of dollars, there's an obvious disconnect between an employee's perception and the risk to the organization," the source reported. 

This lack of education about the true value of company data is putting corporations at serious risk. Having a remote DBA service conduct surveillance is a good first step, but educating employees about the dangers of negligent data access is absolutely necessary.

Advances in education show need for database experts

Thu, 2014-03-06 12:22

Whether in an on-premise data center or a remote cloud solution, the need for database administration services will continue to grow as Internet activity increases. Concerns over bandwidth and storage capabilities weigh on executives eager to capitalize on data analytics and the market information it reveals.

As the United States is where the Web was born, it comes as no surprise that the country hosts a higher education institution offering a unique IT program. According to Computerworld, Southern Methodist University, a private institution located in Dallas, Texas, is offering a graduate degree in data center engineering. The source stated that the coursework will begin with core concepts and then branch into specialty curriculums such as database administration, analytics, virtualization and cloud computing.

Edward Forest, program director for SMU's Lyle School of Engineering, stated that the school collaborated with big-name IT service providers such as Hewlett-Packard and Compass Datacenters to assemble the program. The educator claimed that students can study on campus or remotely. He also stated that although similar syllabi exist, few deliver the breadth offered by the core classes.

The institution hopes to fill a growing need for database support services and other related fields by attracting students to the graduate program. 

What the next step entails 
Paul Venezia, a contributor to InfoWorld, stated that the U.S. could be doing more to improve its Internet infrastructure. He claimed that one way to do so would be to figuratively disassemble the entire latticework and see which parts are essential, politically and technologically. The dissection and analysis of these parts has spawned multiple government regulations and increased business outsourcing to remote database support companies.

Venezia extended these concepts by asking which model could best harness the Internet and optimize its use. He detailed three scenarios in which the Internet could either thrive or collapse:

  • The seemingly omnipotent entity is treated like a commodity, driving quality through free market competition.
  • As with electric and telephone communication, highly regulated regional monopolies undertake its maintenance. 
  • Internet access is placed under the care of municipalities, as the country does with water and sewer systems.

Ultimately, the writer favored the first option, stating that giving consumers the ability to choose which companies provide them with access will improve service quality. While acknowledging security concerns, Venezia noted that such considerations will factor into customers' decision-making.

Whether the market is free of restraint or authoritative entities assume responsibility, the Internet is expected to grow. This anticipation is sure to spawn programs similar to the one now offered by SMU.

Data-sharing becoming a habit?

Wed, 2014-03-05 13:29

Alliances between market competitors and scientists have led some organizations to implement data-sharing tools. Whereas some executives believe that digital information should remain confidential, others are finding that collaborative efforts produce profitable results. Due to the complexity of such an operation, a number of corporations are hiring remote database support companies to connect them with other organizations. 

Reuters reported that six major banks, including the United States-based JPMorgan and Citigroup, recently contracted the services of Swift, a Brussels-based organization that provides financial institutions with an avenue for exchanging monetary data. The company's new clients will be able to collect and share the regulatory information many banks are required to amass as part of their due diligence processes.

Senior executives responsible for initiating the deal hope that the database administration service will provide their enterprises with the ability to help one another meet the increasing know-your-customer standards. The report stated that these requirements obligate banks to possess stringent KYC policies to strengthen protection against money laundering and fraud.

Although customer confidentiality remains a concern, Swift assured its clients that each financial institution will retain ownership of its own information and can disallow other banks from viewing the data if they wish. 

Preparation for disasters 
In addition to bank executives, scientists have pooled their refined data to identify a wide variety of trends. For example, because climate change has become such a widespread concern, environmental experts have relied on remote database services to enable information sharing. The ability to contribute several different statistical viewpoints on such a pressing matter has helped environmental analysts issue white papers detailing possible solutions.

Scientists who study tectonics are also looking to utilize DBA services. According to KGW TV, experts at Central Washington University claim that data collected from global positioning systems designed to monitor earthquakes has helped regions throughout the world anticipate the disasters before they occur.

The article stated that the Pacific Northwest Geodetic Array tracks 500 GPS stations throughout the region in real time. The system received considerable attention after data revealed that the United States West Coast is susceptible to sustaining a magnitude 9 earthquake.

"The problem, says CWU seismologist Tim Melbourne, is that GPS networks aren't sharing their data," stated the news source. "He wants to see that changed."

Being able to consolidate such information onto a single platform could save thousands of lives. As Melbourne continues to urge other scientists to share their data, he may seek the advice of database experts to create a network capable of supporting it.

Schools witness continued use of databases

Wed, 2014-03-05 01:57

In response to the No Child Left Behind Act of 2001, educators throughout the United States have implemented ways to ensure that their students are abiding by standards established by the federal government. Some school officials, in conjunction with state departments, have consulted with database administration services to deploy networks possessing a wide variety of student data across entire districts. 

Statewide ventures
One such database has already been deployed. According to Brookfield Patch, the Connecticut State Department of Education recently established its own system for assessing and distinguishing school performance. The network, monitored by a remote DBA support firm, categorizes each institution throughout the state into one of three primary designations based on the average results of students' Connecticut Mastery Tests, Connecticut Academic Performance Tests and other exams:

  1. "Excelling" is assigned to schools possessing an SPI over 88, a four-year graduation rate over 94 percent and at least 25 percent of students partaking in advanced classes in three of four subject areas.
  2. "Progressing" means that a school has an SPI of 88, but a 10-point gap or larger exists for underachieving subgroups and the graduation rate is between 94 to 90 percent.
  3. "Transition" is designated to institutions that have an SPI between 88 and 64. The establishments are missing either the 90 percent four-year graduation standard or a subgroup gap higher than 10 points. 

An additional three categories, "review," "focus" and "turnaround," classify schools as possessing substandard performance characteristics and are typically under scrutiny by the CSDE. 

Access to standardization 
Although a number of U.S. constituents perceive standardization to be detrimental to students, the issue isn't dissipating. According to VentureBeat, Desire2Learn, a learning software provider, recently acquired the Achievement Standards Network, an online database of education standards. 

Desire2Learn informed the news source that it will assume full responsibility for operating ASN's public data infrastructure. Whether the company will consult database experts on maintaining the networks has not been clarified, but it's assumed that the size of the operation will convince it to do so.

Jeremy Auger, co-founder and chief strategy officer of Desire2Learn, claimed that he hopes the transition will give clarity to teachers confounded by the wide array of differing education standards throughout the country. The effort is a part of ASN's desire to digitize learning content and simplify the materials search process. 

The deployment of these two databases signifies a revolutionary shift in the U.S. education system. Defining stringent standards and making them available for widespread consumption is sure to solidify the policies initiated by No Child Left Behind and the Common Core.