Chris Foot

Remote DBA Experts Blog

Disaster Recovery Planning

Tue, 2014-03-11 07:03

DBAs tend to get pulled in many different directions in a company: performance tuning, database design, systems administration, networking, report writing; the list goes on and on. With all of the diverse tasks that a DBA must tend to every day, we sometimes forget the core responsibility of a database administrator: protecting data. One of the fundamental aspects of protecting data is planning for disasters.

When most people think of disasters, they think large scale:

  • Fire in the Data Center
  • Flood
  • Tornado
  • Hurricane
  • Widespread Power Outage

Beyond these, there are many smaller-scale problems that can cause an outage and are just as dangerous to the data we’re responsible for:

  • Server Blue Screen of Death
  • Failed Disk Drive
  • Corrupt Database or Corrupt Database Page
  • SQL Injection Attacks
  • The DELETE Statement When the User Forgot to Add a WHERE Clause (whoops, I just deleted all the sales data!)

Microsoft SQL Server comes with a variety of high availability and disaster recovery technologies (database backups and restores, Clustering, Mirroring, Log Shipping, Replication, SQL 2012 AlwaysOn Availability Groups) to help us prepare for the worst. Each of these technologies helps us recover from specific disaster scenarios, and many of them can be combined to prepare for multiple scenarios.
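
To make this concrete, here is a minimal T-SQL sketch of the most basic layer of any DR plan: a full backup, a log backup and a verification pass. The database name and file paths are placeholders, and the log backup assumes the database uses the full recovery model.

    BACKUP DATABASE [Sales]                             -- placeholder database name
        TO DISK = N'D:\Backups\Sales_Full.bak'
        WITH INIT, CHECKSUM, STATS = 10;

    BACKUP LOG [Sales]                                  -- requires the full recovery model
        TO DISK = N'D:\Backups\Sales_Log.trn'
        WITH CHECKSUM, STATS = 10;

    -- Confirms the backup file is readable; only a periodic test restore on
    -- another server truly proves the backup is good.
    RESTORE VERIFYONLY
        FROM DISK = N'D:\Backups\Sales_Full.bak'
        WITH CHECKSUM;

Even this simple routine speaks to two of the questions below: where the backups live and whether they have ever been verified.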

Much thought and planning must go into developing a disaster recovery plan that meets the recoverability needs and budget of the organization. Here are some key questions to ask yourself and your business users when planning a DR strategy:

  • How critical is each database to the business?
  • How long can the business survive without each database? 1 minute? 8 hours? 2 days?
  • Where are database backups stored? What happens if this location is unavailable?
  • When was the last time I tested my database backups to make sure they are valid?
  • Do I know how to restore the system databases if the server needs to be rebuilt? When was the last time I practiced this?
  • If I get hit by a bus, are there simple, easy-to-follow instructions that someone else can use to get back online?
  • Does every user really need SysAdmin or Database Owner privileges? (See the audit query below.)
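
For that last question, a quick way to start the audit is to list everyone in the sysadmin fixed server role. This is a hedged illustration using standard catalog views; you can adapt the same pattern to check database roles such as db_owner.

    SELECT r.name AS server_role,
           m.name AS member_login
    FROM sys.server_role_members AS srm
    JOIN sys.server_principals AS r
        ON r.principal_id = srm.role_principal_id
    JOIN sys.server_principals AS m
        ON m.principal_id = srm.member_principal_id
    WHERE r.name = N'sysadmin';   -- every row returned should have a documented reason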

Having an outline of the recoverability requirements and their associated costs will drive the DR plan you design. For example, failover clustering is more complex and costly than log shipping, but if your primary server hits a blue screen of death, a failover cluster will get you back online much more quickly and with less data loss. Can your business afford a couple of hours of downtime, or the loss of 15-60 minutes of data?

The summary chart below can be used to help compare the available options along with relative costs:

[Summary chart comparing the SQL Server HA/DR options and their relative costs is not reproduced here.]
Chart adapted from Database Mirroring in SQL Server 2005, Table 15: Comparing SQL Server 2005, http://technet.microsoft.com/library/Cc917680

A good DR plan will target each type of disaster and spell out the steps needed to recover with the least impact to users and the least risk of data loss. Don’t wait until a disaster strikes to start building your DR plan. A real disaster is a stressful time, and as the protectors of the data, we need to be cool, calm, and collected when facing one. Having a pre-established plan will alleviate a lot of pressure so that you can make the best decisions to recover data efficiently.

Thank you for reading my first RDX blog post! Stay tuned for more posts soon.

It’s time for a new data assessment plan

Mon, 2014-03-10 01:53

As organizations shift to different platforms, private enterprises and public agencies are contacting database administration support companies to conduct extensive reviews of their data hosting systems. Whether on-premise or in the cloud, amassing large amounts of data through PCs, interfaces, mobile devices and other machines has become the norm.

Keeping a lid on it
Companies new to harnessing digital information often bite off more than they can chew, so they consult database experts to make sense of it all. Enterprise Apps Today reported that these professionals typically start by evaluating their clients' master data. Reviewing all data sources and how the data is organized – with best practices in mind – can keep enterprises from overlooking costly errors, such as misaligned asset and customer data.

"The key is to examine company data from an unbiased perspective with the most critical business processes in mind," noted the source. 

Where database support services really benefit companies is their ability to identify data inaccuracies. Because the environment in which digital information is held spans entire departments, mistakes may accidentally be introduced by different machines.

Assessing mobile devices 
Smartphones and tablets are becoming increasingly involved with database contribution, but monitoring and protecting these avenues has been an undervalued priority. Absolute Software recently surveyed 1,000 employees who use mobile phones for work. When questioned about penalties if they lose company data, 25.2 percent of respondents claimed that they shouldn't receive punishment because it wasn't their responsibility to practice safe security measures. 

This apparent lack of interest would explain why many enterprises hire remote database services to monitor connections between machines and the network. The survey also noted an overwhelming amount of apathy. Apparently, 75 percent of corporate employees felt that they should be reprimanded for the transgression, but that the extent of their punishment should be mild.

"Although employees may feel the data on their device is only worth $500, when you consider that most data breach penalties can range into the millions of dollars, there's an obvious disconnect between an employee's perception and the risk to the organization," the source reported. 

The lack of education in regard to the true value of company data is putting corporations severely at risk. Having a remote DBA service conduct surveillance is a good first step, but enlightening employees as to the dangers of negligent data access is absolutely necessary. 

Advances in education show need for database experts

Thu, 2014-03-06 12:22

Whether in an on-premise data center or a remote cloud solution, the need for database administration services will continue to grow as Internet activity increases. Concerns over bandwidth and storage capabilities plague the minds of executives eager to capitalize on data analytics and the market information it reveals.

Since the United States is the birthplace of the Internet, it comes as no surprise that the country is home to a university offering a unique graduate IT program. According to Computerworld, Southern Methodist University, a private institution located in Dallas, Texas, is offering a graduate degree in data center engineering. The source stated that the coursework will begin with core concepts and then branch into specialty curricula such as database administration, analytics, virtualization and cloud computing.

Edward Forest, program director for SMU's Lyle School of Engineering, stated that the school collaborated with big-name IT service providers such as Hewlett-Packard (HP) and Compass Datacenters to assemble the program. The educator claimed that students can study on campus or remotely. He also stated that although similar syllabi exist, few deliver the expansiveness offered by the core classes.

The institution hopes to fill a growing need for database support services and other related fields by attracting students to the graduate program. 

What the next step entails 
Paul Venezia, a contributor to InfoWorld, stated that the U.S. could be doing more to improve its Internet infrastructure. He claimed that one way of doing this would involve figuratively disassembling the entire latticework and seeing which parts are essential, politically and technologically. The dissection and analysis of these parts has spawned multiple government regulations and increased business outsourcing to remote database support companies.

Venezia mirrored these concepts by wondering which platform could best harness the Internet and optimize its use. He detailed three scenarios in which the Internet could either thrive or collapse:

  • The seemingly omnipotent entity is treated like a commodity, driving quality through free market competition.
  • As with electric and telephone communication, highly regulated regional monopolies undertake its maintenance. 
  • Internet access is placed under the care of municipalities, as the country does with water and sewer systems.

Ultimately, the writer favored the first option, stating that giving consumers the ability to dictate which companies provide them with access will improve service quality. Although acknowledging security concerns, Venezia noted that such considerations will factor into the customer decision-making process. 

Whether the market is free of restraint or authoritative entities assume responsibility, the Internet is expected to grow. This anticipation is sure to spawn programs similar to the one now offered by SMU.

Data-sharing becoming a habit?

Wed, 2014-03-05 13:29

Alliances between market competitors and scientists have led some organizations to implement data-sharing tools. Whereas some executives believe that digital information should remain confidential, others are finding that collaborative efforts produce profitable results. Due to the complexity of such an operation, a number of corporations are hiring remote database support companies to connect them with other organizations. 

Reuters reported that six major banks, two of which are United States-based companies JPMorgan and Citigroup, recently contracted the services of Swift, a Brussels-based organization that provides financial institutions with an avenue to exchange monetary data. The company's new clients will be able to collect and share regulatory information many banks are required to amass as a part of their due diligence processes. 

Senior executives responsible for initiating the deal hope that the database administration service will provide their enterprises with the ability to help one another meet the increasing know-your-customer standards. The report stated that these requirements obligate banks to possess stringent KYC policies to strengthen protection against money laundering and fraud.

Although customer confidentiality remains a concern, Swift assured its clients that each financial institution will retain ownership of its own information and can disallow other banks from viewing the data if they wish. 

Preparation for disasters 
In addition to bank executives, scientists have pooled their refined data with one another to identify an incredible variety of trends. For example, because climate change has become such a widespread concern, environmental experts have relied on remote database services to allow information sharing. The ability to contribute several different statistical viewpoints on such a pressing matter has helped environmental analysts issue whitepapers detailing possible solutions. 

Scientists involved in tectonics are looking to utilize DBA services. According to KGW TV, experts at Central Washington University claim that data collected from global positioning systems designed to monitor earthquakes has helped regions throughout the world anticipate the disasters before they occur.

The article stated that the Pacific Northwest Geodetic Array tracks 500 GPS stations throughout the region in real time. The system received considerable attention after data revealed that the U.S. West Coast is susceptible to a magnitude 9 earthquake.

"The problem, says CWU seismologist Tim Melbourne, is that GPS networks aren't sharing their data," stated the news source. "He wants to see that changed."

Being able to consolidate such information onto a single platform could possibly save thousands of lives. As Melbourne continues to urge other scientists to share their measurements, he may seek the advice of database experts to create a network to support the data.

Schools witness continued use of databases

Wed, 2014-03-05 01:57

In response to the No Child Left Behind Act of 2001, educators throughout the United States have implemented ways to ensure that their students are abiding by standards established by the federal government. Some school officials, in conjunction with state departments, have consulted with database administration services to deploy networks possessing a wide variety of student data across entire districts. 

Statewide ventures
One such database has already been deployed. According to Brookfield Patch, the Connecticut State Department of Education recently established its own system for assessing and distinguishing school performance. The network, monitored by a remote DBA support firm, categorizes each institution throughout the state into one of three designations based on the average results of student Connecticut Mastery Tests, Connecticut Academic Performance Tests and other exams:

  1. "Excelling" is assigned to schools possessing an SPI over 88, a four-year graduation rate over 94 percent and at least 25 percent of students partaking in advanced classes in three of four subject areas.
  2. "Progressing" means that a school has an SPI of 88, but a 10-point gap or larger exists for underachieving subgroups and the graduation rate is between 94 to 90 percent.
  3. "Transition" is designated to institutions that have an SPI between 88 and 64. The establishments are missing either the 90 percent four-year graduation standard or a subgroup gap higher than 10 points. 

An additional three categories, "review," "focus" and "turnaround," classify schools as possessing substandard performance characteristics and are typically under scrutiny by the CSDE. 

Access to standardization 
Although a number of U.S. constituents perceive standardization to be detrimental to students, the issue isn't dissipating. According to VentureBeat, Desire2Learn, a learning software provider, recently acquired the Achievement Standards Network, an online database of education standards. 

Desire2Learn informed the news source that the organization will undertake all responsibility for operating ASN's public data infrastructure. Whether the company will consult with database experts or maintain the networks on its own has not been clarified, but it's assumed that the size of the operation will convince it to seek outside help.

Jeremy Auger, co-founder and chief strategy officer of Desire2Learn, claimed that he hopes the transition will give clarity to teachers confounded by the wide array of differing education standards throughout the country. The effort is a part of ASN's desire to digitize learning content and simplify the materials search process. 

The deployment of these two databases signifies a revolutionary shift in the U.S. education system. Defining stringent standards and making them available for widespread consumption is sure to solidify the policies initiated by No Child Left Behind and the Common Core. 

In-memory the future for databases?

Tue, 2014-03-04 11:11

Because of a need for simplified internal optimization algorithms and faster data access, companies collecting huge amounts of digital information are looking to operate their databases on an in-memory platform. In response, database administration services are being consulted to help businesses transition from disk storage to the more flexible solution. 

The transition to in-memory databases originated with the advent of big data and the now-impending demise of business intelligence systems. According to Apps Tech News, before BI was introduced into the business market, organizations maintained separate repositories of information. For example, database support services held company finances in a server operating an ERP system, which couldn't be compared with the consumer-related information attained through a CRM solution. 

BI allowed organizations to consolidate these assets onto a single database, enabling C-suite officials and their subordinates to access and analyze information from a variety of contributors. However, with the introduction of the cloud and the residual technologies developed under its influence, large enterprises are now turning toward data analytics tools possessing superior digital information refinement techniques. 

Large, adaptable and fast 
Companies are realizing that hosting their big data analysis programs on in-memory databases or a cloud environment is more practical than attempting to run these systems through a disk storage option. In light of this realization, database experts have adjusted their services to cater to businesses with these desires. 

InformationWeek reported that the price of the RAM used by in-memory systems is declining, making mechanical disks comparatively less cost-effective. The former solution is capable of processing information queries 20 to 100 times faster than the latter.

Merchandising services company Edgenet claimed that the more contemporary system has brought real-time insight into product availability for customers of major construction supply chains such as Lowe's and Home Depot. 

"In-memory capabilities have let the online gaming company Bwin.party go from supporting 12,000 bets per second to supporting up to 150,000," the source stated. 

The article also noted that competition within the in-memory data processing industry is heating up. Oracle and its competitors are constantly keeping tabs on one another to make sure that their technology can obtain and comb through data much more efficiently than the others'.

As organizations – particularly large-scale enterprises – take note of the aforementioned success stories, many will turn to database support services to help them install and maintain the in-memory option.

Mobile device management catches the eye of worried parents

Fri, 2014-02-28 11:43

Private enterprises and public organizations have sought the expertise of database administration professionals to provide widespread connectivity between mobile devices and central databases. The prevalence of the cloud has fostered interest in integrating tablets and smartphones into the workplace and has given executives the ability to view market changes in real-time.

According to Federal News Radio, United States government agencies and bureaus heavily entrenched in data collection and analysis are looking to capitalize on the benefits associated with mobile device management. In one instance, the U.S. Postal Service's Inspector General's office virtualized more than 400 servers in an effort to foster a bring-your-own-device (BYOD) initiative within the organization. 

In turn, the Navy is making the transition to BYOD with the help of a database administration service. The military entity told Federal News Radio that the virtualization of the organization's servers by 2017 will serve as a means to reduce database operation costs and improve access through mobile devices. The flexibility of the solution could benefit sailors operating on large vessels as well as enhance the communication capabilities of the department as a whole. 

Good luck hiding text messages from parents 
CIO acknowledged the advent of mobile device management in large enterprises, but the source also noted a more personalized technology developed by Remo Software. The article stated that the program allows parents to regulate and monitor their kids' activity on smartphones and tablets across Android, Windows 8 and iOS, as well as PCs and Macs. Essentially, the software would give users the ability to restrict teens and their younger counterparts from viewing explicit content. 

"Parents often use devices to access work files, pay bills and store confidential data that they wouldn't want in the hands of strangers," noted CIO. 

Although the technology is sure to receive an incredible amount of criticism from adolescents, the software provides guardians with the invaluable capability of knowing where their dependents are. Remo Software CEO Omer Faiyaz told CIO that parents can also view text messages, which may spawn a new form of digital language among teens looking to ensure that communications between them and their peers remain confidential. 

This new technology, along with the federal government's willingness to integrate mobile device management into its networks, will foster new interest in remote database services. Getting soldiers and civilians acclimated to the new technology is sure to require the knowledge of professionals well-versed in database operations.

Health care industry expected to invest more in database support services

Fri, 2014-02-28 02:27

The role of the chief information officer in the United States health care industry is expected to grow over the next couple of years. In response to the Affordable Care Act, U.S. hospitals, insurance providers and health clinics are transitioning to a value-based care delivery system that will force them to seek more efficient ways to provide services. Hospital CIOs are being viewed as transformative elements. Administrators are seeking ways to change facility operations instead of simply maintaining them. 

CIO Magazine reported that increased CIO responsibility will be followed by tough decisions to make and hard questions to answer. Although transitioning to an electronic health record system is an expensive solution, it is a necessary step for optimizing hospital efficiency. With the assistance of remote DBA experts, an accessible system capable of interacting with mobile devices can provide doctors and nurses with a way to obtain patient information more quickly, as opposed to perusing filing cabinets for 15 minutes. 

After implementation occurs, the news source recommended that the CIO work in conjunction with the database administration service to educate hospital personnel on how the EHR operates. Many DBA service providers conduct training sessions aimed toward employees who possess no prior experience with new programs, but the CIO's mediation should be mandatory. 

Database administration in federal authorities
According to FierceBiotechIT, the United States Food and Drug Administration's health care surveillance database, Mini-Sentinel, contains information on over 160 million individuals, 3.5 billion medication dispensings and 3.8 billion medical encounters, surpassing the goals set by Congress, which obligated the organization to hold records for 100 million patients by July 2012.

The incredible amount of data contained within Mini-Sentinel has necessitated the consultation of database experts to ensure that all records are kept secure and that the system operates adequately. 

One of the key elements the database provides FDA personnel with is the ability to accurately trace the effects of various medications on patients throughout the U.S. For example, utilizing Mini-Sentinel data helped staffers assess bleeding rates and thromboembolic outcomes associated with the usage of various blood-thinning medications. 

"With many patients now taking multiple drugs over many years, the potential for unexpected interactions and long-term effects has escalated," the news source reported. 

Integrating the health care industry's information into flexible, scalable databases that are adequately monitored could potentially change the way care is delivered to consumers. Throughout the next few years, the presence of the CIO and other IT professionals in hospitals and clinics will become more apparent. 

Maintaining Your Databases on SQL Server Express Edition

Thu, 2014-02-27 10:46

Do you ever click on a bookmark to find that the website is no longer available? That scenario was the inspiration for my first post. I was posed a question regarding setting up maintenance tasks for SQL Server Express Edition, and my bookmark no longer worked. Luckily, I already had a local copy of the script and its usage, but I wanted to ensure that the solution could be easily found by others. You can find an executable of Jasper Smith’s original ExpressMaint utilities for SQL Server 2005/2008/2008 R2 here: http://expressmaint.codeplex.com/. Because there have been no recent updates, I have updated the script myself and made some modifications to it. This new script can be downloaded here.

Additions and Improvements

  • Updated Error Handling
  • Forced Cleanup Parameters – this ensures no backup or report files are accidentally deleted if you do not define the variables.
  • Redesigned Specifically for SQL Server Express, Express with Tools, and Express with Advanced Services Editions
  • SQL Server 2012 Support
  • Backups for All Databases Can Be Performed in One Task
  • ‘Smart’ Exclusion of Databases Using the Simple Recovery Model when Performing Transaction Log Backups – you can use one task to perform transaction log backups without worrying, even if only one database is using the simple recovery model.
  • ‘Smart’ Exclusion of the Master Database from the Differential Backup Task

Permissions

This script will create a stored procedure named ‘expressmaint’ in the master database by default, which can then be run on a schedule through the Windows Task Scheduler with ‘xp_cmdshell’ and ‘Ole Automation Procedures’ enabled. In SQL Server 2005, the account that will run the scheduled task must be a member of the sysadmin role in your Express Edition instance. There are two ways to set up the permissions to run this stored procedure in SQL Server 2008 and higher:

1. Add the account to the sysadmin role.
2. Grant the account execute permission on the ‘expressmaint’ stored procedure in the database it was created in, and add it to the db_backupoperator role in the database(s) you will be backing up. You will also need to set up the xp_cmdshell proxy account. (A hedged example follows below.)
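
Here is a hedged sketch of what option 2 might look like on SQL Server 2008 or higher. The login and database names are placeholders, and you should weigh the security implications of xp_cmdshell and Ole Automation Procedures in your own environment before enabling them.

    -- Enable the server options this scheduling approach relies on.
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'xp_cmdshell', 1;
    RECONFIGURE;
    EXEC sp_configure 'Ole Automation Procedures', 1;
    RECONFIGURE;

    -- Allow a non-sysadmin login (placeholder name) to run the procedure and take backups.
    USE master;
    CREATE USER [DOMAIN\ExpressMaintTask] FOR LOGIN [DOMAIN\ExpressMaintTask];  -- if not already present
    GRANT EXECUTE ON dbo.expressmaint TO [DOMAIN\ExpressMaintTask];

    USE YourUserDatabase;   -- placeholder; repeat for each database being backed up
    CREATE USER [DOMAIN\ExpressMaintTask] FOR LOGIN [DOMAIN\ExpressMaintTask];  -- if not already present
    EXEC sp_addrolemember N'db_backupoperator', N'DOMAIN\ExpressMaintTask';

    -- Non-sysadmin callers of xp_cmdshell also need the proxy account configured, for example:
    -- EXEC sp_xp_cmdshell_proxy_account 'DOMAIN\ExpressMaintTask', 'proxy-account-password';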

Parameters

There are several options for configuring your SQL Server Express Edition maintenance tasks:

[The parameter reference tables were provided as images and are not reproduced here.]

Scheduling

Since SQL Server Express Edition does not include SQL Server Agent, the only way to schedule maintenance tasks is through the Windows Task Scheduler. First, you will need to create a SQL file that calls the expressmaint stored procedure and passes in the parameters you need. Then create a new scheduled task and set the command to run SQLCMD.exe. The parameters to pass to SQLCMD are:

[The SQLCMD parameter table was provided as an image and is not reproduced here.]

Examples

On Windows 2003, suppose your SQL Server Express Edition instance is named SQLExpress2008R2 and you wish to perform full backups for all databases and generate a report text file. In this example, you will keep one week of backups in C:\Backups and one month of report files in C:\Reports. Old backups will be deleted before new ones are created. The scheduled task will run under a username and password that has the appropriate permissions within SQL Server.

[The original example screenshot is not reproduced here; a hedged sketch of the setup follows.]
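
Because the original screenshot is no longer available, here is a hedged reconstruction of what the two pieces might look like. The expressmaint parameter names are illustrative only (the modified script defines its own, so check them against the parameter tables included with the download), and the paths and instance name simply restate the scenario above.

    -- C:\Scripts\FullBackups.sql (sketch): full backups of all databases,
    -- keeping one week of backups in C:\Backups and one month of reports in C:\Reports.
    EXEC master.dbo.expressmaint
         @database      = 'ALL',            -- hypothetical parameter names throughout
         @optype        = 'DB',
         @backupfldr    = 'C:\Backups',
         @reportfldr    = 'C:\Reports',
         @verify        = 1,
         @dbretainunit  = 'weeks',
         @dbretainval   = 1,
         @rptretainunit = 'months',
         @rptretainval  = 1,
         @report        = 1;

    -- The Windows scheduled task would then run something like:
    -- SQLCMD.exe -S .\SQLExpress2008R2 -E -i "C:\Scripts\FullBackups.sql"
    -- (-E uses the task's Windows credentials; substitute -U/-P for a SQL login.)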

On Windows Server 2008, suppose your SQL Server Express Edition instance is the default instance and you wish to use one SQL file for all maintenance tasks. The example below runs full backups for all user databases and generates a report text file. You will keep one week of backups in C:\UserBackups and one month of report files in C:\Reports. The scheduled task will run under a username and password that has the appropriate permissions within SQL Server.

[The original example screenshot is not reproduced here.]

Thank you for reading my first RDX blog post. Feel free to reach out to me on Google+ or leave a comment below to start a discussion. Stay tuned for my next post in the coming weeks!

Health care industry continues to rely on remote database services

Thu, 2014-02-27 01:44

Delivering affordable care to patients remains a daunting task for many in the health care industry. Some hospitals have invested in more advanced technology, and insurance companies have assembled packages that look more appealing. All of this takes place in the shadow of the Affordable Care Act and what it means for the industry as a whole. In search of cost efficiency, hospitals and government officials are turning to networking services and remote database support to optimize patient-to-doctor communication.

Kaiser Health News collaborated with The Seattle Times on a report claiming that Washington State is looking to promote transparency of care expenses through a new bill. Pioneered by Gov. Jay Inslee and a team of state lawmakers, the resulting amendment would produce a statewide database listing hundreds of medical procedures and what they would cost at clinics and hospitals. The system would also provide information regarding facility reliability and staff quality.

While Washington looks for a viable database administration service, some insurance companies have expressed discontent with the proposed bill. For example, Regence Blue Shield and Premera Blue Cross claim that the new requirements will disclose the treatment prices they've negotiated with clinics and hospitals in confidence. 

"The biggest question is whether patients will seek this information and use it to shop more wisely," the news article noted. "Transparency supporters think the public is ready for them to pull back the curtain on cost and quality."

Expanding networks 
In a separate endeavor, non-profit health organization Torrance Memorial Medical Center announced its implementation of a local area network to improve wireless infrastructure in its existing hospital building and six corresponding facilities. 

Health IT Security reported that the deployment isn't coming without security concerns. The company, which treats an estimated 23,000 patients annually, is seeking database experts to help ensure patient record confidentiality. Todd Felker, the organization's infrastructure and security architect, told the news source that being able to isolate the network visitors use from the one employees utilize will be the greatest challenge. 

Felker also expressed the need for mobile device protection procedures, claiming that many of those instruments are susceptible to cyber attacks. What's more alarming is that some of them are medical mechanisms. 

"There are threats such as Distributed Denial of Service as well as hackers trying to do a scan to try to take advantage of a vulnerability on a device," he told the news source. 

Between enlightening health care consumers about the exact costs facing them and giving patient visitors the ability to work while in health care facilities, networking capabilities and limitations are sure to be tested. 

Could IT security concerns be vanquished by cloud solutions?

Wed, 2014-02-26 11:29

Perhaps one of the most prohibitive aspects of cloud services is the perception that network security suffers when compared to on-premise IT solutions that keep data within direct owner control. However, industry leaders are now seeing more opportunity than ever with regard to the security capabilities of the cloud, reaffirming the technology as a force to be reckoned with in the coming years. As cloud protection measures become more substantial and reliable, companies that remain skeptical of the movement will find themselves quickly left behind if they don't adapt soon.

Key features of cloud security come into play
A few years ago, IT leaders may not have been able to deliver a great answer when confronted with questions about cloud security strategies, but times have changed as more off-premise solutions are offering a host of protection measures not available to the first generation of adopters. Despite security becoming a more urgent and complex challenge in light of recent, highly publicized data breaches, IT experts are confident that the cloud has the potential to step up and defend organizations better than ever before. According to a WhaTech article highlighting a recent Gartner report, sensitive data may be better protected in the cloud than with traditional network solutions, leaving business leaders with little left to complain about with regard to the technology. 

"The benefits cloud security offers – particularly encryption – are making it an increasingly popular choice," Kelly Kavanagh, principal research analyst at Gartner, told the news source. "The benefits of deploying cloud-based security services are clear. Aside from the broad area of IAM, specific controls such as encryption are becoming vital to the adoption of cloud computing. They are further helping to generate interest in this particular form of security service delivery."

Organizations will require specialized security solutions
As with any aspect of corporate IT, businesses will need to find the right cloud security service to suit their particular needs. This is especially important considering the targeted nature of last year's data breaches, which went after retailers through their point-of-sale systems.

"Enterprises will begin to shift more of their legacy and mission-critical applications to the cloud as the economics become more compelling," Dell's President's Advisory Board was quoted as stating in a recent BizJournals blog post. Security is without a doubt mission-critical in today's IT landscape. 

Just as organizations look to integrate cloud-based infrastructure, storage and mobile device management, security will soon be entrenched in the marketplace as another sought-after solution.

Application Change Management Best Practices

Wed, 2014-02-26 11:26

Database and operating system administrators are ultimately responsible for guaranteeing the quality of their organization’s information processing environments. From protecting against unauthorized access to providing 24×7 availability – “the buck stops with the support unit.” Although the database infrastructure (DB binaries, O/S, hardware) doesn’t change much, there is one component that usually changes a lot – the application. This blog post provides readers with helpful hints and tips on application change management best practices.

I started my career working in a mainframe environment. Well-known database guru Craig Mullins and I administered DB2 databases on huge, big iron platforms – platforms that supported thousands upon thousands of concurrent users. One of the benefits of this background is that it taught us the importance of change management best practices.

Craig and I learned that a key ingredient of a trouble-free mainframe environment was ensuring that there were no “surprises” when a change was implemented in production – changes that can affect the day-to-day business operations of an entire organization. Throughout my career, I have applied these “mainframe style” best practices to all other database ecosystems that I was responsible for supporting.

It works. The first company I applied these best practices to was selected by Oracle as a “Showcase Environment”. This was back in the old days, when Scott’s tiger was just a cub. Oracle identified shops that it thought had rock-solid support environments and asked them to host visitors from other organizations that wanted to create their own high-quality support infrastructures.

My current organization, Remote DBA Experts (RDX), supports over 300 customers. We are responsible for monitoring and administering thousands (and thousands) of servers. Our customers’ environments consist of a myriad of third-party and homegrown applications running on every imaginable database/operating system/hardware combination. We provide support for all major data infrastructure products including SQL Server, Oracle, Oracle EBS, DB2 LUW, DB2 mainframe, MySQL, PostgreSQL, Windows and UNIX/Linux operating systems as well as Hadoop. During our last SSAE16 audit, we were evaluated on 15,000 individual change requests. The changes ranged from “create this user” to “please build this Oracle RAC 8 Node Cluster”. We implement a lot of changes at RDX.

It is something we are very good at. I thought I would provide you with a few helpful hints and tips on our change management best practices. Since few readers will work for a remote database and operating systems services provider, I’ll tailor my recommendations to readers working in smaller and/or newer shops that may not have a complete set of change management processes in place.

Database Design Reviews

One of my earlier blog posts provides information on database design reviews. Database design review meetings foster effective communication between the DBA unit, system support personnel and application developers throughout the entire application design and implementation process. When database design issues are addressed early in the development lifecycle, problems are minimized and the migration from test to production is more easily accomplished.

If you haven’t read the design review blog, you should. I’m intentionally not covering the importance of rigorous testing of any change before it is implemented in production because it is covered in-depth in the design review blog. From simple changes to new application implementations, there is simply no reason not to perform the test, review, change, test, review, change iteration lifecycle. Although the blog post covers new application implementations, the post will show you how important I think it is to follow a rigorous test plan.

Proceduralize the Change Request Process

Database administrators usually support different business units, with each unit having its own set of unique procedural requirements. Formalizing and documenting the change request process minimizes the potential for miscommunication between the business units, application development areas and the database administration unit.

The notification lead-time required for the database administration team to perform a requested change should be documented and distributed to business and application development units. This will prevent your team from getting a request in the morning to migrate a database from test to production with a required date of that afternoon. Of course, we all know that never happens. Since our customers at RDX share support personnel with each other, following our SLAs is of utmost importance. It enables RDX to provide high-quality support to all customers. We completely understand that business needs often demand quick implementations, but we make every attempt to work with our customers to help them plan their changes in advance.

We break our response time SLAs into different categories based on the complexity of the change and the amount of work it requires. We have a different lead time for simple database object changes vs. creating that Oracle RAC 8 Node Cluster I was discussing earlier in this blog.

If your organization doesn’t have a formal change request process in place (and many shops don’t), create your own. There are dozens of work request ticketing, change management and source code versioning tools available on the market today. The prices can range from thousands to tens of thousands of dollars.

Although I highly recommend these types of products, I wouldn’t let the lack of having one prevent me from formalizing the change management process. You can do the best with what you have.

OK, so you don’t have the good fortune of having a formal change management process in place. What do you do? You can begin the formalization of the change request process by:

  • Creating standardized change request documents
  • Establishing change management meetings
  • Creating Service Level Agreements (SLAs), which include change request lead and turnaround times.

Standardized Change Request Documents

Standardized request documents help to increase the quality of the change request process. The forms are sent to the support unit by the application owner of the data to be processed. The forms can be created using a variety of different products. The mechanisms can range from sophisticated ticketing systems using web forms to simple paper documents. As I said, use what’s available. It’s not the mechanism; it’s the process that is important.

Any requests not sent or signed off by the application owner should be rejected. Keep copies of all completed work requests for auditing purposes. Application owners can be virtually any member of the organization who is identified as having the authority to sign off on change requests. The most common persons are application development team leaders, section heads, department heads, etc. At RDX, if you aren’t identified as a “Change Data Agent” for your organization, we won’t process the ticket.

Each request form contains the following common information (a simple tracking-table sketch follows this list):

  • Form identifier – naming convention that allows the form to be easily identified
  • Application name
  • Server name (for OS requests)
  • Database name (for DB requests)
  • Name and contact information of the person requesting the change
  • Request date
  • Required date (including specific time change needs to be implemented)
  • Application owner signoff
  • Data security signoff (if required by shop standards)
  • Schema Change Area
    • Schema owner of object to be changed or created
    • Object name to be changed or created
    • Object type (i.e. table, index, view) of the object to be changed or created
    • Detailed description of change requested
    • Data administration sign off for new data objects
  • A free form request area that further describes the change. Also provides an area for non-schema changes
  • Verification procedures – other units required to verify as well as verification procedures
  • Notification procedures – who to notify when the change is complete
  • An area that the technician will complete when executing the change that contains the following information:
    • Technician executing change
    • Technician contact information
    • Date and time change was processed
    • Verification procedures followed
    • Notification procedures followed
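
If you have no ticketing tool at all, even a single tracking table gives you an auditable record of these requests. The sketch below is one possible shape for such a table, not a prescribed design; the column names are illustrative.

    CREATE TABLE dbo.ChangeRequest (
        change_request_id  int IDENTITY(1,1) PRIMARY KEY,
        form_identifier    varchar(50)  NOT NULL,   -- naming convention for the form
        application_name   varchar(100) NOT NULL,
        server_name        varchar(100) NULL,       -- for OS requests
        database_name      varchar(100) NULL,       -- for DB requests
        requestor_name     varchar(100) NOT NULL,
        requestor_contact  varchar(200) NOT NULL,
        request_date       datetime     NOT NULL,
        required_date      datetime     NOT NULL,   -- include the specific time
        app_owner_signoff  varchar(100) NOT NULL,
        security_signoff   varchar(100) NULL,       -- if required by shop standards
        schema_owner       varchar(100) NULL,       -- schema change area
        object_name        varchar(200) NULL,
        object_type        varchar(50)  NULL,       -- table, index, view, etc.
        change_description varchar(max) NOT NULL,
        verification_proc  varchar(max) NULL,
        notification_proc  varchar(max) NULL,
        technician_name    varchar(100) NULL,       -- filled in when the change is executed
        executed_at        datetime     NULL
    );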

Here are a few examples of specific forms that will help formalize the change request process:

Database and OS Authorization Request Form

This form is used for requesting authorization changes to the database and/or operating system environment.

The Database and Operating System Authorization Request Form will include all of the requestor information contained in the previous form but will also record information pertinent to authorization requests:

  • Grantee listing for security grant or revoke
  • Type of security granted or revoked

Production Environment Change Request Form

This form is used for requesting the migration of database objects (databases, table spaces, tables, indexes, etc.) from test to production and the alteration of existing production objects. In addition, the form notifies the support team to perform various database, operating system and hardware parameter and environment changes in production environments.

Each production environment change request form must have an associated test environment change request counterpart. If the change wasn’t made in test, you don’t implement it in production. To facilitate this process, the identifier for the test change request that was used to notify the support team should be provided on the production change request form.

The production environment change request form contains the following information pertinent to production environments:

  • Test environment change request identifier – allows the technician to determine whether the change was implemented in test. If no change request is found, the person tasked with implementing the request needs to determine the reason why.
  • Form identifier – naming convention that allows the form to be easily identified
  • Application name
  • Server name (for OS requests)
  • Database name (for DB requests)
  • Name and contact information of the person requesting the production change
  • Request date
  • Required date (including specific time change needs to be implemented)
  • Application owner signoff
  • Data security signoff (if required by shop standards)
  • Schema Change Area
    • Schema owner of object to be migrated or altered in production
    • Object name to be altered or migrated
    • Object type (i.e. table, index, view) of the object to be altered or migrated
    • Detailed description of change requested
    • Data administration sign off (if required by shop standards)
  • A free form request area that further describes the change. Also provides an area for non-schema changes
  • Verification procedures – other units required to verify as well as verification procedures
  • Back-off procedures – what to do if the change has an adverse effect on the system or does not work “as initially thought”
  • Notification procedures – who to notify when the change is complete
  • An area that the technician will complete when executing the change that contains the following information:
    • Technician executing change
    • Technician contact information
    • Date and time change was processed
    • Verification procedures followed
    • Notification procedures followed

Change Management Meetings

If you read my earlier blog post on database design review meetings, you know I’m a proponent of constant communication between all units that are involved in the change management process. How often should you hold these change management meetings? You should hold them as often as you implement objects in production. If your organization makes changes to production environments on a daily basis, the meetings should be held daily. This is not as big of an imposition on your time as you may think. We provide remote database services for several very large organizations that have these change management meetings on a daily basis. The process takes about 15 to 20 minutes, not a lot of time spent to ensure that everyone knows what is happening.

To shorten the amount of time these meetings consume and to make them as productive as possible, the following discussion items should be a standard part of the meeting’s agenda:

  • Application name being changed
  • Date and time change will be implemented
  • Change description
  • Potential business impact if the changes don’t go as expected (include both units affected and how they will be affected)
  • Verification procedures
  • Back-off procedures
  • Requestor
  • Tested by

Service Level Agreements

Identifying support needs and expectations is required to provide high-quality support. You probably won’t be meeting all of your customers’ expectations if you don’t know what any of them are. As stated previously, each application has its own unique set of support requirements and expectations. Service Level Agreements (SLAs) help to solidify support requirements and dispel any inflated expectations a business or application development unit may have. They probably won’t be aware of your current workload and resulting work request lead times until you tell them. The support team lead should meet with each unit supported to establish a set of measurable Service Level Agreements that include work request lead times, support activities required and application performance and availability objectives.

Wrapup

This is by no means an all-inclusive list of activities you need to perform to standardize the change request process. It is intended to give you a head start in the right direction.

IT leaders put mobility at the forefront of their strategies

Wed, 2014-02-26 09:54

Now that mobile device management is making a name for itself as a primary pillar of enterprise IT, businesses and service providers alike are searching for the best way to prioritize mobility and get the most out of personal devices in the workplace. Approaches have ranged from free-form bring-your-own-device implementation to strict, software-based device management platforms and everything in between, but there is still work to be done before mobile earns its place as a must-have component of any company regardless of sector or trade.

Mobile device management gets personal
Every business expects something different from mobile based on the unique needs of its operations, and a main obstacle for vendors has been the integration of mobile platforms within companies' pre-existing infrastructure. Fortunately for business leaders, software applications are more customizable and specialized than ever, allowing IT departments to overcome the compatibility issues that have held their mobile strategies back in the past. In fact, the freedom to tailor mobile solutions will render many device management issues obsolete.

This is why Oracle made a splash last year when it released its mobile application development framework (ADF) and opened up the playing field for business leaders trying to find the right fit for their organizations. According to FierceMobileIT, this was a major step forward for the role of mobile within the larger scheme of enterprise IT.

"The most complex, time-consuming and expensive part of any enterprise mobile application has to do with how you integrate that mobile app into the enterprise," Suhas Uliyar, vice president of mobile strategy product management at Oracle, told the news source.

Front and back-end development is key
Whether an organization is looking to integrate mobile resources exclusively within its network or allow customers and clients to use their devices as well, a customized solution will be the best approach as unique challenges and goals arise down the line. 

Database experts to assist companies in cloud migration

Wed, 2014-02-26 02:04

Operating through an on-premise data center is gradually becoming less attractive to company executives constantly bombarded with sentiments praising the operational capabilities of the cloud. As a result, many are considering making the transition to the environment, but lack the resources required to do so.

According to David Linthicum, a contributor to InfoWorld, cloud computing spending is expected to reach $235.1 billion by 2017, triple the amount spent on the technology in 2011. The expert noted that the majority of investment will be in infrastructure, application hosting and data analytics. Transitioning such a large number of in-house data centers to the cloud will require the sophistication of database experts capable of orchestrating the integration in a timely, cost-efficient manner.

"Enterprises will use either a direct and simple port (lift-and-shift) or more complex rebuilds, where the applications are modified to use specific cloud services," the news source stated.

Linthicum acknowledged that data-as-a-service investment has skyrocketed, primarily due to two key factors.

  1. The technology that database support services offer includes quality transaction processing and analytics applications.
  2. During the transition period, remote DBA providers allow clients to adapt to complex database tools.

Responsibility of the chief information officer
CloudTweaks reported that many CIOs invest in DBA services to help their businesses transition to the cloud. Part of their responsibility is getting employees – particularly the company's IT department – acclimated to the way the remote option operates. Another appeal to outsourcing database maintenance to a third party is that doing so alleviates stress placed on the business's IT personnel.

"The fact is that implementation of any new technology solution requires an initial outlay," the news source reported.

The CIO's primary focus should be accurately communicating the abilities of his or her team to the database administration service. As each company operates under a specific model, it's necessary for those instigating the transition to the cloud to receive information on how the enterprise operates. In addition, he or she must acknowledge what applications the business plans on utilizing in the new environment.

The news source also claimed that CIOs should look for vendors capable of performing quality tests that will identify problems before any major data-compromising issues arise.

Ultimately, preparing a team of IT professionals for the transition is the key to successfully migrating to a cloud environment. Although the technology isn't new, some 20- or 30-year veterans will have to be familiarized with new protocols.

How will mobile devices bring the cloud to new heights?

Tue, 2014-02-25 15:01

Business leaders who have dabbled in bring-your-own-device and mobile device management strategies are well aware of the powerful benefits afforded by the technology, but few have taken full advantage of the movement due to shaky employee compliance, a lack of dedicated software and a host of IT security concerns. However, there is no better match for the business world than that of mobile and cloud services, and companies are beginning to realize this as personal devices take the spotlight in many of 2014's cloud-based offerings.

The state of mobile today and its bright future
It may seem redundant to discuss the growth of mobile and the success it will continue to have down the line, but a recent article from CloudTech served to reaffirm the fact that the technology is here to stay and will only be made more influential by the influx of cloud-based services. The news source highlighted a report from Cisco predicting that mobile data traffic will grow 11 times over between now and 2018 worldwide, with an expected 190 exabyte annual run rate by that year. The number of mobile Internet connections will likely exceed 10 billion by then, surpassing the predicted world population of 7.6 billion.

"Such growth is not only indicative of mobility becoming a critical characteristic of almost every network experience and the value consumers and businesses alike place on it, but it also represents the immense opportunities ahead for service providers who sit at the center of the Internet of Everything," Doug Webster, Vice President of Products and Solutions Marketing at Cisco, told the news source.

App stores built for business take off
As the cloud has shaped its identity in the past few years, a large component of its success has been the speed with which services and applications can be implemented in nearly any business environment. This has not only made it easier for companies to get the most out of these services off the bat, but it has also allowed providers to offer trials and demos of software and infrastructure that very accurately represent the functionality of the real thing. In the mobile arena where businesses are still skeptical of applications and services, this capability is a lifesaver for vendors.

With respect to mobile, Oracle is at the head of its field once again, launching a cloud application store for its 390,000 customers to test, purchase and utilize a growing list of over 100 applications.

New Oracle apps to optimize business data

Tue, 2014-02-25 01:48

As database administration services have helped maintain business information capacity, the programs necessary for harvesting big data and making sense of it all are seeing equally strong demand. Mapping consumer purchase trends and surpassing competitors is possible with the assistance of data analytics tools.

Because corporations are looking to combine contrasting data sets, usage of remote database support has become more commonplace. Automated Trader reported that using big data analytics applications depends on what particular practices a business employs. 

"If you're a large multinational with many disparate data sets (customer databases) and you want a single customer view, then big data is probably for you," the news source stated. "If you're a smaller company, a distributor or a professional adviser and you want to make more of your current client database then you'll benefit from good analytics." 

Oracle's contribution
According to Database Trends and Applications, software engineering company Oracle released two new applications, the PeopleSoft In-Memory Financial Position Analyzer and PeopleSoft Financial Allocations Analyzer, which are capable of leveraging the entirety of a business's customer financial data, deploying live production information and placing these operations into an in-memory model, which is supported by database experts. 

"Traditionally, CFOs have had to model their financial position by exporting information, usually in summarized form to a spreadsheet or other external analytics, resulting in data that is not complete and is less than real-time," said John Webb, vice president of Oracle PeopleSoft.

Oracle experts claim that the programs use a closed-loop process, which eliminates the need to export information out of the enterprise system. For example, proposed business changes are evaluated in program-initiated simulations and can then be deployed directly into production. Because the data is processed and organized by the software in real time, executives know that the information is current and accurate.
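To make the closed-loop idea more concrete, here is a minimal sketch of in-memory what-if analysis in general terms. It is not Oracle's implementation; the ledger rows, account names and allocation rule are invented for illustration, and a real deployment would run against production tables rather than a hard-coded list.

# A generic, hypothetical illustration of in-memory what-if analysis.
# The ledger rows and allocation rule below are invented for this sketch;
# they do not represent Oracle PeopleSoft's data model or logic.

ledger = [
    {"dept": "IT",    "account": "overhead", "amount": 120000},
    {"dept": "Sales", "account": "overhead", "amount": 80000},
    {"dept": "IT",    "account": "payroll",  "amount": 450000},
    {"dept": "Sales", "account": "payroll",  "amount": 300000},
]

def allocate_overhead(rows, it_share):
    """Reallocate total overhead between departments using the given IT share."""
    overhead = sum(r["amount"] for r in rows if r["account"] == "overhead")
    return {"IT": overhead * it_share, "Sales": overhead * (1 - it_share)}

# A proposed rule change is evaluated as a simulation over the live, in-memory data...
current = allocate_overhead(ledger, it_share=0.60)
proposed = allocate_overhead(ledger, it_share=0.45)
print("current :", current)
print("proposed:", proposed)

# ...and, once approved, the same rule is applied in production with no
# export to a spreadsheet or other external analytics tool.

The point of the sketch is simply that the scenario runs against the same in-memory data the production system uses, so nothing has to be exported, summarized and re-imported.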

The greatest benefit these two programs provide is the ability to predict customer needs. In that respect, the software isn't the first of its kind, but the fact that it removes unnecessary data export and transfer steps is a big win for corporations with limited resources. The software isn't exclusively suited to large enterprises, either; small and mid-sized businesses could benefit from the tools it provides as well.

Getting better acquainted with customer finances can give a company a huge leg up on the competition. Being able to figure out what consumers are willing to pay for enables businesses to adjust their marketing campaigns and determine how their products can become more affordable.

New cyber security mandates expected to appear

Mon, 2014-02-24 09:15

Harnessing predictive analytics software often necessitates the use of database support services. An incredible amount of information that could be used to chart market events and consumer trends is now flowing into businesses' data centers faster than ever, requiring a stable environment.

According to Data-Informed Magazine, e-commerce sites have become more involved with their customers and are going beyond routine merchandise catalogs. Now, applications can track consumer website navigation, customize real-time promotions, construct buyer profiles and use insights from social media platforms to create a more interactive experience. Such activity wouldn't be possible without predictive analytics tools or remote database support.

"Under the hood, the application is constantly connecting the dots, joining data from many fast changing data tables to create that customized experience," the news source reported. 

Consumers have grown more comfortable with e-commerce over the past couple of years, despite the threat of security breaches. The majority of the data that travels through a website is live and operational, requiring the surveillance and protection offered by database experts well versed in monitoring virtual assets. In addition, the prevalence of data analysis applications has caught the attention of the United States Congress. 

Discussions in the U.S. Senate
According to Mondaq, Sen. Pat Leahy, D-Vt., along with Sens. Chuck Schumer, D-N.Y., Al Franken, D-Minn., and Richard Blumenthal, D-Conn., sponsored a new bill that would require organizations handling the personal information of 10,000 or more U.S. citizens to deploy rigorous protective measures, including a system that audits the performance and vulnerabilities of their security architecture. The proposed law would also require organizations and their remote database support to notify federal authorities in the event of a breach compromising 5,000 or more people.

A separate bill sponsored by Sens. Jay Rockefeller, D-W.Va., Mark Pryor, D-Ark., Bill Nelson, D-Fla., and Dianne Feinstein, D-Calif., would require every business handling personal information to carry out the prescribed security measures, regardless of the number of clients. In the event of a security breach, the law would require companies to notify consumers within 30 days.

Despite these differences, both legislative proposals call for civil penalties in the event of a security or notification violation. The two bills also introduce criminal liability if a company sustains a data breach and fails to notify consumers or authorities.

If either of the prospective bills is passed, the demand for quality database administration is sure to increase. Having a staff of experienced, resourceful IT professionals monitoring data activity 24/7 is the best way to establish a vigilant security posture.

Northeast adopting DBA services

Mon, 2014-02-24 01:49

The 21st century is revolutionizing the way people consume information. It's not uncommon for elementary school students to learn about their surroundings through websites that are constantly maintained by remote database services. Many constituents look to government entities, such as the Treasury Department, to provide facts and statistics that are held in complex data centers.

Education initiatives
According to Lohud Magazine, New York has delayed the release of EngageNY, a statewide database containing identifiable information on the Empire State's student body. The data was supposed to be disclosed in April to inBloom, a non-profit organization that provides an array of resources to educators. However, the company won't receive data from EngageNY until the start of the next school year. The news source stated that the delay was the result of privacy concerns from parents and lawmakers.

Tom Dunn, a spokesman for the Education Department, said that the full set of data, which will contain students' names and addresses, will be shared through a protected connection with EngageNY's partners two months before the full release. The program's database administration service is reevaluating its security procedures to ensure that all protective measures are in place and has found no potentially threatening bugs thus far.

"When all the data is submitted, the database will include information about student enrollment, attendance, behavior and test grades," Lohud stated. "Students, teachers and parents will have access, while third parties would have to access if they contact with the state for 'specific contracted educational purposes only,'" the news source reported.

Fostering opinions
The refined information that the government provides to constituents is often accepted as truth. However, lawmakers are realizing that legislative conclusions offer only one perception of any given situation. The Middletown Press reported that Connecticut Gov. Dan Malloy recently signed an executive order that will make raw government-collected data available to the public. The governor said the state will seek the assistance of a remote database support group to create the online portal.

Malloy stated that the purpose of the order is to provide unrefined information so that it can be analyzed by researchers and entrepreneurs. He hopes the initiative will help foster economic growth in cities such as Hartford and Bridgeport. The executive order requires public organizations to begin compiling data for the database immediately.

The initiative to supply unrefined data to the public may renew constituent confidence in the government's ability to provide quality services.

Leading database vendors move in the direction of open source

Sun, 2014-02-23 15:13

Open-source software is nothing unusual in the world of IT, and its success is beginning to make an impact on the data center hardware landscape. A few years ago it would have seemed highly unlikely that any major database provider would open up its server architecture to a boundless community of users, but according to BusinessCloudNews, Microsoft recently announced that it would be teaming up with Facebook on the Open Compute Project (OCP). Not only is this a significant event for one of the world's leading IT enterprises, but industry analysts are predicting that it will lead to a variety of advancements in cloud and web-scale computing and encourage other major providers to do the same.

"These servers are optimized for Windows Server software and built to handle the enormous availability, scalability and efficiency requirements of Windows Azure, our global cloud platform. They offer dramatic improvements over traditional enterprise server designs: up to 40 percent server cost savings, 15 percent power efficiency gains and 50 percent reduction in deployment and service times," Bill Laing, corporate vice president for cloud and enterprise at Microsoft, told the news source.

Why open source rules the startup world
While OCP is making headlines with its Microsoft partnership, startups worldwide are enjoying the independence and flexibility granted by open-source architecture, with some even choosing to make their own code accessible to all. According to VentureBeat, this trend has taken pressure off small firms looking to maintain their autonomy and save on expensive platforms. Open source also offers a greater level of reliability, which is essential for small firms trying to stay afloat without having to worry about faulty systems. As open source becomes even more popular, businesses at all levels can expect a faster and smarter cloud in the future.

Government expresses interest in databases

Fri, 2014-02-21 01:18

Multi-platform database support has been renowned for its ability to provide businesses with the infrastructure necessary to maintain vast amounts of information and operate applications that improve day-to-day office functions. Now, the United States federal government has expressed interest in harnessing database capabilities to maintain records.

However, many organizations and U.S. constituents believe that the government has nefarious motives. According to The Washington Post, the U.S. Department of Homeland Security was interested in contracting a private database administration firm to install a nationwide information center containing data collected from license-plate readers. The news source stated that U.S. Immigration and Customs Enforcement would use the system to assist agents in locating and arresting criminal aliens, which would reduce surveillance hours and improve officer safety. 

In light of widespread privacy concerns and government mistrust, the article released by the Post convinced DHS Secretary Jeh Johnson to cancel the solicitation, CBS News reported. Gillian Christensen, a spokeswoman for Immigration and Customs Enforcement, stated that the organization's leadership was unaware that the contract request had been posted to begin with.

Citizens' perspective
The remote database monitoring solution could have stored data on every car that crossed the path of a license-plate reader, regardless of whether the vehicle's owner had a criminal background. Before the solicitation's cancellation was announced, the American Civil Liberties Union stated that the initiative was simply another strategy for federal authorities to collect massive amounts of data on civilians without just cause.

"The group has argued that constant monitoring can 'chill the exercise of our cherished rights to free speech and association,'" The Washington Post reported. 

Public service initiatives 
On the other hand, the Tustin-based California Public Policy Center recently activated a database containing detailed 2012 pension information from 31 California public pension funds. The group is also seeking to obtain data from 38 other systems. According to the Merced Sun-Star, CPPC President Mark Bucher said he hopes the availability of the information will motivate constituents to advocate for improved compensation.

Although prospective retirees expressed concern over the vulnerability of the information, Bucher said that a judge ultimately decided the data was public.

The collection of large data stores is becoming more prevalent each year. As private businesses continue to use predictive analytics tools in an attempt to gain a competitive edge, it's no surprise that state legislatures and federal agencies are interested in how databases can help them learn about their citizens.