
Chris Foot

Remote DBA Experts Blog

Schools witness continued use of databases

Wed, 2014-03-05 01:57

In response to the No Child Left Behind Act of 2001, educators throughout the United States have implemented ways to ensure that their students are abiding by standards established by the federal government. Some school officials, in conjunction with state departments, have consulted with database administration services to deploy networks possessing a wide variety of student data across entire districts. 

Statewide ventures
One such database has already been deployed. According to Brookfield Patch, the Connecticut State Department of Education recently established its own system for assessing and distinguishing school performance. The network, monitored by a remote DBA support firm, categorizes each institution throughout the state into one of three designations based on the average results of student Connecticut Mastery Tests, Connecticut Academic Performance Tests and other exams:

  1. "Excelling" is assigned to schools possessing an SPI over 88, a four-year graduation rate over 94 percent and at least 25 percent of students partaking in advanced classes in three of four subject areas.
  2. "Progressing" means that a school has an SPI of 88, but a 10-point gap or larger exists for underachieving subgroups and the graduation rate is between 94 to 90 percent.
  3. "Transition" is designated to institutions that have an SPI between 88 and 64. The establishments are missing either the 90 percent four-year graduation standard or a subgroup gap higher than 10 points. 

An additional three categories, "review," "focus" and "turnaround," classify schools as possessing substandard performance characteristics and are typically under scrutiny by the CSDE. 

Access to standardization 
Although a number of U.S. constituents perceive standardization to be detrimental to students, the issue isn't dissipating. According to VentureBeat, Desire2Learn, a learning software provider, recently acquired the Achievement Standards Network, an online database of education standards. 

Desire2Learn informed the news source that the organization will undertake all responsibility for operating ASN's public data infrastructure. Whether the company will consult with database experts on maintaining the networks has not been clarified, but it's assumed that the size of the operation will convince it to do so. 

Jeremy Auger, co-founder and chief strategy officer of Desire2Learn, claimed that he hopes the transition will give clarity to teachers confounded by the wide array of differing education standards throughout the country. The effort is a part of ASN's desire to digitize learning content and simplify the materials search process. 

The deployment of these two databases signifies a revolutionary shift in the U.S. education system. Defining stringent standards and making them available for widespread consumption is sure to solidify the policies initiated by No Child Left Behind and the Common Core. 

In-memory the future for databases?

Tue, 2014-03-04 11:11

Because of a need for simplified internal optimization algorithms and faster data access, companies collecting huge amounts of digital information are looking to operate their databases on an in-memory platform. In response, database administration services are being consulted to help businesses transition from disk storage to the more flexible solution. 

The transition to in-memory databases originated with the advent of big data and the now-impending demise of business intelligence systems. According to Apps Tech News, before BI was introduced into the business market, organizations maintained separate repositories of information. For example, a company's finances were held in a server operating an ERP system and couldn't be compared with the consumer-related information attained through a CRM solution. 

BI allowed organizations to consolidate these assets onto a single database, enabling C-suite officials and their subordinates to access and analyze information from a variety of contributors. However, with the introduction of the cloud and the residual technologies developed under its influence, large enterprises are now turning toward data analytics tools possessing superior digital information refinement techniques. 

Large, adaptable and fast 
Companies are realizing that hosting their big data analysis programs on in-memory databases or a cloud environment is more practical than attempting to run these systems through a disk storage option. In light of this realization, database experts have adjusted their services to cater to businesses with these desires. 

InformationWeek reported that the price of RAM associated with memory-based systems is declining, making mechanical disks comparatively less attractive. The former solution is capable of processing information queries 20 to 100 times faster than the latter.

Merchandising services company Edgenet claimed that the more contemporary system has brought real-time insight into product availability for customers of major construction supply chains such as Lowe's and Home Depot. 

"In-memory capabilities have let the online gaming company Bwin.party go from supporting 12,000 bets per second to supporting up to 150,000," the source stated. 

The article also noted that competition within the industry of in-memory data processing is heating up. Oracle and its competitors are constantly keeping tabs on one another to make sure that their technology can obtain and comb through data much more efficiently than the others'. 

As organizations – particularly large-scale enterprises – take note of the aforementioned success stories, many will turn to database support services to help them install and maintain the in-memory option.

Mobile device management catches the eye of worried parents

Fri, 2014-02-28 11:43

Private enterprises and public organizations have sought the expertise of database administration professionals to provide widespread connectivity between mobile devices and central databases. The prevalence of the cloud has fostered interest in integrating tablets and smartphones into the workplace and has given executives the ability to view market changes in real-time.

According to Federal News Radio, United States government agencies and bureaus heavily entrenched in data collection and analysis are looking to capitalize on the benefits associated with mobile device management. In one instance, the U.S. Postal Service's Inspector General's office virtualized more than 400 servers in an effort to foster a bring-your-own-device (BYOD) initiative within the organization. 

In turn, the Navy is making the transition to BYOD with the help of a database administration service. The military entity told Federal News Radio that the virtualization of the organization's servers by 2017 will serve as a means to reduce database operation costs and improve access through mobile devices. The flexibility of the solution could benefit sailors operating on large vessels as well as enhance the communication capabilities of the department as a whole. 

Good luck hiding text messages from parents 
CIO acknowledged the advent of mobile device management in large enterprises, but the source also noted a more personalized technology developed by Remo Software. The article stated that the program allows parents to regulate and monitor their kids' activity on smartphones and tablets across Android, Windows 8 and iOS, as well as PCs and Macs. Essentially, the software would give users the ability to restrict teens and their younger counterparts from viewing explicit content. 

"Parents often use devices to access work files, pay bills and store confidential data that they wouldn't want in the hands of strangers," noted CIO. 

Although the technology is sure to receive an incredible amount of criticism from adolescents, the software provides guardians with the invaluable capability of knowing where their dependents are. Remo Software CEO Omer Faiyaz told CIO that parents can also view text messages, which may spawn a new form of digital language among teens looking to ensure that communications between them and their peers remain confidential. 

This new technology, along with the federal government's willingness to integrate mobile device management into its networks will foster new interest in remote database services. Getting soldiers and civilians acclimated to the new technology is sure to require the knowledge of professionals well-versed in database operations. 

Health care industry expected to invest more in database support services

Fri, 2014-02-28 02:27

The role of the chief information officer in the United States health care industry is expected to grow over the next couple of years. In response to the Affordable Care Act, U.S. hospitals, insurance providers and health clinics are transitioning to a value-based care delivery system that will force them to seek more efficient ways to provide services. Hospital CIOs are being viewed as transformative elements. Administrators are seeking ways to change facility operations instead of simply maintaining them. 

CIO Magazine reported that increased CIO responsibility will be followed by tough decisions to make and hard questions to answer. Although transitioning to an electronic health record system is an expensive solution, it is a necessary step for optimizing hospital efficiency. With the assistance of remote DBA experts, an accessible system capable of interacting with mobile devices can provide doctors and nurses with a way to obtain patient information more quickly, as opposed to perusing filing cabinets for 15 minutes. 

After implementation occurs, the news source recommended that the CIO work in conjunction with the database administration service to educate hospital personnel on how the EHR operates. Many DBA service providers conduct training sessions aimed toward employees who possess no prior experience with new programs, but the CIO's mediation should be mandatory. 

Database administration in federal authorities
According to FierceBiotechIT, the United States Food and Drug Administration's health care surveillance database, Mini-Sentinel, contains information on over 160 million individuals, 3.5 billion medication dispensings and 3.8 billion medical encounters, surpassing the goals set by Congress, which obligated the organization to hold records for 100 million patients by July 2012. 

The incredible amount of data contained within Mini-Sentinel has necessitated the consultation of database experts to ensure that all records are kept secure and that the system operates adequately. 

One of the key elements the database provides FDA personnel with is the ability to accurately trace the effects of various medications on patients throughout the U.S. For example, utilizing Mini-Sentinel data helped staffers assess bleeding rates and thromboembolic outcomes associated with the usage of various blood-thinning medications. 

"With many patients now taking multiple drugs over many years, the potential for unexpected interactions and long-term effects has escalated," the news source reported. 

Integrating the health care industry's information into flexible, scalable databases that are adequately monitored could potentially change the way care is delivered to consumers. Throughout the next few years, the presence of the CIO and other IT professionals in hospitals and clinics will become more apparent. 

Maintaining Your Databases on SQL Server Express Edition

Thu, 2014-02-27 10:46

Do you ever click on a bookmark to find that the website is no longer available? That scenario was the inspiration for my first post. I was asked a question regarding setting up maintenance tasks for SQL Server Express Edition, and my bookmark no longer worked. Luckily, I already had a local copy of the script and its usage, but I wanted to ensure that the solution can be easily found by others. You can find an executable of Jasper Smith’s original ExpressMaint utilities for SQL Server 2005/2008/2008 R2 here: http://expressmaint.codeplex.com/. Because there have been no recent updates, I have taken that script and made some modifications to it. This new script can be downloaded here.

Additions and Improvements

  • Updated Error Handling
  • Forced Cleanup Parameters – this ensures no backup or report files are accidentally deleted if you do not define the variables.
  • Redesigned Specifically for SQL Server Express, Express with Tools, and Express with Advanced Services Editions
  • SQL Server 2012 Support
  • Backups for All Databases Can Be Performed in One Task
  • ‘Smart’ Exclusion of Databases Using the Simple Recovery Model when Performing Transaction Log Backups – you can use one task to perform transaction log backups without worrying, even if only one database is using the simple recovery model.
  • ‘Smart’ Exclusion of Master Database from Differential Backup Task

Permissions

This script will create a stored procedure named ‘expressmaint’ in the master database by default, which can then be run on a schedule through the Windows Task Scheduler with ‘xp_cmdshell’ and ‘Ole Automation Procedures’ enabled. In SQL Server 2005, the account that will run the scheduled task must be a member of the sysadmin role in your Express Edition instance. There are two ways to set up the permissions to run this stored procedure in SQL Server 2008 and higher:

1. Add the account to the sysadmin role
2. Grant the account execute permission on the ‘expressmaint’ stored procedure in the database it was created in and db_backupoperator in the database(s) you will be backing up. You will also need to set up the xp_cmdshell proxy account.
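
As a rough sketch of both options, assuming a hypothetical MYDOMAIN\SqlMaintTask account that already exists as a login and is mapped to a user in master and in each database being backed up, the setup might look something like this:

-- Enable the server options the stored procedure relies on (run as sysadmin).
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
EXEC sp_configure 'Ole Automation Procedures', 1;
RECONFIGURE;

-- Option 1: simply add the task account to the sysadmin role, or
-- Option 2 (SQL Server 2008 and higher): grant only what is needed.
-- MYDOMAIN\SqlMaintTask is a placeholder for your scheduled task account.
USE master;
GRANT EXECUTE ON dbo.expressmaint TO [MYDOMAIN\SqlMaintTask];

USE YourUserDatabase;   -- repeat for each database you will be backing up
EXEC sp_addrolemember 'db_backupoperator', 'MYDOMAIN\SqlMaintTask';

-- Non-sysadmin callers also need the xp_cmdshell proxy account.
EXEC sp_xp_cmdshell_proxy_account 'MYDOMAIN\SqlMaintTask', 'StrongPasswordHere';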

Parameters

There are several options for configuring your SQL Server Express Edition maintenance tasks:

(The parameter reference table was provided as images in the original post and is not reproduced here.)

Scheduling

Since SQL Server Express Edition does not have an Agent, the only way to schedule maintenance tasks is through the Windows Task Scheduler. First, you will need to create a .sql file that calls the expressmaint stored procedure and passes in the parameters you need. Then create a new scheduled task and set the command to run as SQLCMD.exe. The parameters to pass to sqlcmd are shown below.

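The original table was an image; as a rough guide, the switches you typically need are -S for the server and instance name, -E for Windows authentication (or -U and -P for a SQL login) and -i for the input .sql file. For example:

REM Run the maintenance script against a named Express instance
REM (instance name and file path are placeholders).
sqlcmd -S .\SQLExpress2008R2 -E -i "C:\Maintenance\FullBackups.sql"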

Examples

In the first example, the server is running Windows 2003, the SQL Server Express Edition instance name is SQLExpress2008R2, and you wish to perform full backups for all databases and generate a report text file. You will be keeping one week of backups in C:\Backups and one month of report files in C:\Reports. Old backups will be deleted before creating new ones. The scheduled task will be run under a username and password that has the appropriate permissions within SQL Server.

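The original example was provided as an image. Below is a rough sketch of what the two pieces might look like; the expressmaint parameter names and values are illustrative only, so check them against the parameter documentation in the downloaded script before relying on them.

-- C:\Maintenance\FullBackups.sql (parameter names are illustrative placeholders)
EXEC master.dbo.expressmaint
    @database        = 'ALL',          -- every database on the instance
    @optype          = 'DB',           -- full database backup
    @backupfldr      = 'C:\Backups',   -- backup destination
    @backupretention = '1w',           -- keep roughly one week of backups
    @delfirst        = 1,              -- delete old backups before creating new ones
    @report          = 1,              -- generate a report text file
    @reportfldr      = 'C:\Reports',   -- report destination
    @reportretention = '1m';           -- keep roughly one month of reports

REM Scheduled task command on Windows 2003, run under an account with the permissions above
sqlcmd -S .\SQLExpress2008R2 -E -i "C:\Maintenance\FullBackups.sql"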

In the second example, the server is running Windows Server 2008, the SQL Server Express Edition instance is the default instance, and you wish to use one .sql file for all maintenance tasks. This example will run full backups for all user databases and generate a report text file. You will be keeping one week of backups in C:\UserBackups and one month of report files in C:\Reports. The scheduled task will be run under a username and password that has the appropriate permissions within SQL Server.

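This example was also shown as an image in the original post. Under the same caveats as the sketch above, the main differences are the default instance name and the backup folder:

REM One .sql file drives all maintenance; default instance on Windows Server 2008
REM (file path and folder names are placeholders).
sqlcmd -S . -E -i "C:\Maintenance\UserDatabaseMaintenance.sql"

-- Inside UserDatabaseMaintenance.sql, the same illustrative expressmaint call as above,
-- but pointed at C:\UserBackups and C:\Reports and limited to user databases.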

Thank you for reading my first RDX blog post. Feel free to reach out to me on Google+ or leave a comment below to start a discussion. Stay tuned for my next post in the coming weeks!

Health care industry continues to rely on remote database services

Thu, 2014-02-27 01:44

Delivering affordable care to patients remains a daunting task for many in the health care industry. Some hospitals have invested in more advanced technology, and insurance companies have assembled packages that look more appealing. All of this unfolds in the shadow of the Affordable Care Act and what it means for the industry as a whole. In search of cost efficiency, hospitals and government officials are turning to networking services and remote database support to optimize patient-to-doctor communication. 

Kaiser Health News collaborated with The Seattle Times on a report claiming that Washington State is looking to promote transparency of care expenses through a new bill. Pioneered by Gov. Jay Inslee and a team of state lawmakers, the resulting amendment would produce a statewide database listing hundreds of medical procedures and what they would cost at clinics and hospitals. The system would also provide information regarding facility reliability and staff quality.

While Washington looks for a viable database administration service, some insurance companies have expressed discontent with the proposed bill. For example, Regence Blue Shield and Premera Blue Cross claim that the new requirements will disclose the treatment prices they've negotiated with clinics and hospitals in confidence. 

"The biggest question is whether patients will seek this information and use it to shop more wisely," the news article noted. "Transparency supporters think the public is ready for them to pull back the curtain on cost and quality."

Expanding networks 
In a separate endeavor, non-profit health organization Torrance Memorial Medical Center announced its implementation of a local area network to improve wireless infrastructure in its existing hospital building and six corresponding facilities. 

Health IT Security reported that the deployment isn't coming without security concerns. The company, which treats an estimated 23,000 patients annually, is seeking database experts to help ensure patient record confidentiality. Todd Felker, the organization's infrastructure and security architect, told the news source that being able to isolate the network visitors use from the one employees utilize will be the greatest challenge. 

Felker also expressed the need for mobile device protection procedures, claiming that many of those instruments are susceptible to cyber attacks. What's more alarming is that some of them are medical mechanisms. 

"There are threats such as Distributed Denial of Service as well as hackers trying to do a scan to try to take advantage of a vulnerability on a device," he told the news source. 

Between enlightening health care consumers about the exact costs facing them and giving patient visitors the ability to work while in health care facilities, networking capabilities and limitations are sure to be tested. 

Could IT security concerns be vanquished by cloud solutions?

Wed, 2014-02-26 11:29

Perhaps one of the most prohibitive aspects of cloud services is the perception that network security suffers when compared to on-premise IT solutions that keep data within direct owner control. However, industry leaders are now seeing more opportunity than ever with regard to the security capabilities of the cloud, reaffirming the technology as a force to be reckoned with in the coming years. As cloud protection measures become more substantial and reliable, companies that remain skeptical of the movement will find themselves quickly left behind if they don't adapt soon.

Key features of cloud security come into play
A few years ago, IT leaders may not have been able to deliver a great answer when confronted with questions about cloud security strategies, but times have changed as more off-premise solutions are offering a host of protection measures not available to the first generation of adopters. Despite security becoming a more urgent and complex challenge in light of recent, highly publicized data breaches, IT experts are confident that the cloud has the potential to step up and defend organizations better than ever before. According to a WhaTech article highlighting a recent Gartner report, sensitive data may be better protected in the cloud than with traditional network solutions, leaving business leaders with little left to complain about with regard to the technology. 

"The benefits cloud security offers – particularly encryption – are making it an increasingly popular choice," Kelly Kavanagh, principal research analyst at Gartner, told the news source. "The benefits of deploying cloud-based security services are clear. Aside from the broad area of IAM, specific controls such as encryption are becoming vital to the adoption of cloud computing. They are further helping to generate interest in this particular form of security service delivery."

Organizations will require specialized security solutions
As with any aspect of corporate IT, businesses will need to find the right cloud security service to suit their particular needs. This is especially important considering the targeted nature of last year's data breaches, which went after retailers through point-of-sale systems. 

"Enterprises will begin to shift more of their legacy and mission-critical applications to the cloud as the economics become more compelling," Dell's President's Advisory Board was quoted as stating in a recent BizJournals blog post. Security is without a doubt mission-critical in today's IT landscape. 

Just as organizations look to integrate cloud-based infrastructure, storage and mobile device management, security will soon be entrenched in the marketplace as another sought-after solution.

Application Change Management Best Practices

Wed, 2014-02-26 11:26

Database and operating system administrators are ultimately responsible for guaranteeing the quality of their organization’s information processing environments. From protecting against unauthorized access to providing 24×7 availability – “the buck stops with the support unit.” Although the database infrastructure (DB binaries, O/S, hardware) doesn’t change much, there is one component that usually changes a lot – the application. This blog post provides readers with helpful hints and tips on application change management best practices.

I started my career working in a mainframe environment. Well-known database guru, Craig Mullins, and I administered DB2 databases on huge, big iron platforms – platforms that supported thousands (upon thousands) of concurrent users. One of the benefits of this background is that it taught us the importance of change management best practices.

Craig and I learned that a key ingredient of a trouble-free mainframe environment was ensuring that there were no “surprises” when a change was implemented in production, changes that can affect the day-to-day business operations of an entire organization. Throughout my career, I have applied these “mainframe style” best practices to all other database ecosystems that I was responsible for supporting.

It works. The first company I applied these best practices to was selected by Oracle as a “Showcase Environment”. This was back in the old days, when Scott’s tiger was just a cub. Oracle identified shops that it thought had rock-solid support environments and asked them to host visitors from other organizations that wanted to create their own high-quality support infrastructures.

My current organization, Remote DBA Experts (RDX), supports over 300 customers. We are responsible for monitoring and administering thousands (and thousands) of servers. Our customers’ environments consist of a myriad of third-party and homegrown applications running on every imaginable database/operating system/hardware combination. We provide support for all major data infrastructure products including SQL Server, Oracle, Oracle EBS, DB2 LUW, DB2 mainframe, MySQL, PostgreSQL, Windows and UNIX/Linux operating systems as well as Hadoop. During our last SSAE16 audit, we were evaluated on 15,000 individual change requests. The changes ranged from “create this user” to “please build this Oracle RAC 8 Node Cluster”. We implement a lot of changes at RDX.

It is something we are very good at. I thought I would provide you with a few helpful hints and tips on our change management best practices. Since few readers will work for a remote database and operating systems services provider, I’ll tailor my recommendations to readers working in smaller and/or newer shops that may not have a complete set of change management processes in place.

Database Design Reviews

One of my earlier blog posts provides information on database design reviews. Database design review meetings foster effective communication between the DBA unit, system support personnel and application developers throughout the entire application design and implementation process. When database design issues are addressed early in the development lifecycle, problems are minimized and the migration from test to production is more easily accomplished.

If you haven’t read the design review blog, you should. I’m intentionally not covering the importance of rigorous testing of any change before it is implemented in production because it is covered in-depth in the design review blog. From simple changes to new application implementations, there is simply no reason not to perform the test, review, change, test, review, change iteration lifecycle. Although the blog post covers new application implementations, the post will show you how important I think it is to follow a rigorous test plan.

Proceduralize the Change Request Process

Database administrators usually support different business units, with each unit having its own set of unique procedural requirements. Formalizing and documenting the change request process minimizes the potential for miscommunication between the business units, application development areas and the database administration unit.

The notification lead-time required for the database administration team to perform a requested change should be documented and distributed to business and application development units. This will prevent your team from getting a request to migrate a database from test to production in the morning with a required date for that afternoon. Of course, we all know that never happens. Since our customers at RDX share support personnel with each other, following our SLAs is of utmost importance. It enables RDX to provide high-quality support to all customers. We completely understand that business needs often demand quick implementations, but we make every attempt to work with our customers to help them plan their changes in advance.

We break our response time SLAs into different categories based on the complexity of the change and the amount of work it requires. We have a different lead time for simple database object changes vs. creating that Oracle RAC 8 Node Cluster I was discussing earlier in this blog.

If your organization doesn’t have a formal change request process in place (and many shops don’t), create your own. There are dozens of work request ticketing, change management and source code versioning tools available on the market today. The prices can range from thousands to tens of thousands of dollars.

Although I highly recommend these types of products, I wouldn’t let the lack of having one prevent me from formalizing the change management process. You can do the best with what you have.

OK, so you don’t have the good fortune of having a formal change management process in place. What do you do? You can begin the formalization of the change request process by:

  • Creating standardized change request documents
  • Establishing change management meetings
  • Creating Service Level Agreements (SLAs), which include change request lead and turnaround times.

Standardized Change Request Documents

Standardized request documents help to increase the quality of the change request process. The forms are sent to the support unit by the application owner of the data to be processed. The forms can be created using a variety of different products. The mechanisms can range from sophisticated ticketing systems using web forms to simple paper documents. As I said, use what’s available. It’s not the mechanism; it’s the process that is important.

Any requests not sent or signed off by the application owner should be rejected. Keep copies of all completed work requests for auditing purposes. Application owners can be virtually any member of the organization who is identified as having the authority to sign off on change requests. The most common persons are application development team leaders, section heads, department heads, etc. At RDX, if you aren’t identified as a “Change Data Agent” for your organization, we won’t process the ticket.

Each request form contains the following common information (a minimal sketch for capturing these fields in a table follows the list):

  • Form identifier – naming convention that allows the form to be easily identified
  • Application name
  • Server name (for OS requests)
  • Database name (for DB requests)
  • Name and contact information of the person requesting the change
  • Request date
  • Required date (including specific time change needs to be implemented)
  • Application owner signoff
  • Data security signoff (if required by shop standards)
  • Schema Change Area
    • Schema owner of object to be changed or created
    • Object name to be changed or created
    • Object type (i.e. table, index, view) of the object to be changed or created
    • Detailed description of change requested
    • Data administration sign off for new data objects
  • A free form request area that further describes the change. Also provides an area for non-schema changes
  • Verification procedures – other units required to verify as well as verification procedures
  • Notification procedures – who to notify when the change is complete
  • An area that the technician will complete when executing the change that contains the following information:
    • Technician executing change
    • Technician contact information
    • Date and time change was processed
    • Verification procedures followed
    • Notification procedures followed
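
If you don't have a ticketing system, even a plain table can standardize how these fields are captured. The sketch below is illustrative only; the column names and sizes are assumptions, not a prescription:

-- Illustrative only: a bare-bones table for capturing the common change request fields.
CREATE TABLE dbo.ChangeRequest (
    FormIdentifier          varchar(30)   NOT NULL PRIMARY KEY, -- naming convention, e.g. CRQ-2014-0001
    ApplicationName         varchar(100)  NOT NULL,
    ServerName              varchar(100)  NULL,       -- for OS requests
    DatabaseName            varchar(100)  NULL,       -- for DB requests
    RequestorName           varchar(100)  NOT NULL,
    RequestorContact        varchar(200)  NOT NULL,
    RequestDate             datetime      NOT NULL,
    RequiredDate            datetime      NOT NULL,   -- includes the specific implementation time
    ApplicationOwnerSignoff varchar(100)  NOT NULL,
    DataSecuritySignoff     varchar(100)  NULL,       -- if required by shop standards
    SchemaOwner             varchar(100)  NULL,       -- schema change area
    ObjectName              varchar(200)  NULL,
    ObjectType              varchar(30)   NULL,       -- table, index, view, ...
    ChangeDescription       varchar(max)  NOT NULL,   -- detailed and free-form request text
    VerificationProcedures  varchar(max)  NULL,
    NotificationProcedures  varchar(max)  NULL,
    -- completed by the technician executing the change
    TechnicianName          varchar(100)  NULL,
    TechnicianContact       varchar(200)  NULL,
    ProcessedDateTime       datetime      NULL,
    VerificationFollowed    bit           NULL,
    NotificationFollowed    bit           NULL
);

Pair it with whatever approval workflow you already have; as noted above, it's the process that matters, not the mechanism.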

Here are a few examples of specific forms that will help formalize the change request process:

Database and OS Authorization Request Form

This form is used for requesting authorization changes to the database and/or operating system environment.

The Database and Operating System Authorization Request Form will include all of the requestor information contained in the previous form but will also record information pertinent to authorization requests:

  • Grantee listing for security grant or revoke
  • Type of security granted or revoked

Production Environment Change Request Form

This form will be used for requesting the migration of database objects (databases, table spaces, tables, indexes, etc.) from test to production and the alteration of existing production objects. In addition, the form notifies the support team to perform various database, operating system and hardware parameter and environment changes in production environments.

Each production environment change request form must have an associated test environment change request counterpart. If the change wasn’t made in test, you don’t implement it in production. To facilitate this process, the identifier for the test change request that was used to notify the support team should be provided on the production change request form.

The production environment change request form contains the following information pertinent to production environments:

  • Test Environment Change Request Identifier – allows the technician to determine if the change was implemented in test. If no change request is found, the person tasked with implementing the request needs to determine the reason why.
  • Form identifier – naming convention that allows the form to be easily identified
  • Application name
  • Server name (for OS requests)
  • Database name (for DB requests)
  • Name and contact information of the person requesting the production change
  • Request date
  • Required date (including specific time change needs to be implemented)
  • Application owner signoff
  • Data security signoff (if required by shop standards)
  • Schema Change Area
    • Schema owner of object to be migrated or altered in production
    • Object name to be altered or migrated
    • Object type (i.e. table, index, view) of the object to be altered or migrated
    • Detailed description of change requested
    • Data administration sign off (if required by shop standards)
  • A free form request area that further describes the change. Also provides an area for non-schema changes
  • Verification procedures – other units required to verify as well as verification procedures
  • Back off procedures – what to do if the change has an adverse effect on the system or does not work “as initially thought”
  • Notification procedures – who to notify when the change is complete
  • An area that the technician will complete when executing the change that contains the following information:
    • Technician executing change
    • Technician contact information
    • Date and time change was processed
    • Verification procedures followed
    • Notification procedures followed

Change Management Meetings

If you read my earlier blog post on database design review meetings, you know I’m a proponent of constant communication between all units that are involved in the change management process. How often should you hold these change management meetings? You should hold them as often as you implement objects in production. If your organization makes changes to production environments on a daily basis, the meetings should be held daily. This is not as big of an imposition on your time as you may think. We provide remote database services for several very large organizations that have these change management meetings on a daily basis. The process takes about 15 to 20 minutes, not a lot of time spent to ensure that everyone knows what is happening.

To shorten the amount of time these meetings consume and to make them as productive as possible, the following discussion items should be a standard part of the meeting’s agenda:

  • Application name being changed
  • Date and time change will be implemented
  • Change description
  • Potential business impact if the changes don’t go as expected (include both units affected and how they will be affected)
  • Verification procedures
  • Back-off procedures
  • Requestor
  • Tested by

Service Level Agreements

Identifying support needs and expectations is required to provide high quality support. You probably won’t be meeting all of your customers’ expectations if you don’t know what any of them are. As stated previously, each application has its own unique set of support requirements and expectations. Service Level Agreements (SLAs) help to solidify support requirements and dispel any inflated expectations a business or application development unit may have. They probably won’t be aware of your current workload and resulting work request lead times until you tell them. The support team lead should meet with each unit supported to establish a set of measurable Service Level Agreements that include work request lead times, support activities required and application performance and availability objectives.

Wrapup

This is by no means an all-inclusive list of activities you need to perform to standardize the change request process. It is intended to give you a head start in the right direction.

IT leaders put mobility at the forefront of their strategies

Wed, 2014-02-26 09:54

Now that mobile device management is making a name for itself as a primary pillar of enterprise IT, businesses and service providers alike are searching for the best way to prioritize mobility and get the most out of personal devices in the workplace. Approaches have ranged from free-form bring-your-own-device implementation to strict, software-based device management platforms and everything in between, but there is still work to be done before mobile earns its place as a must-have component of any company regardless of sector or trade.

Mobile device management gets personal
Every business expects something different from mobile based on the unique needs of its operations, and a main obstacle for vendors has been the integration of mobile platforms within companies' pre-existing infrastructure. Fortunately for business leaders, software applications are more customizable and specialized than ever, allowing IT departments to overcome the compatibility issues that have held their mobile strategies back in the past. In fact, the freedom to tailor mobile solutions will render many device management issues obsolete.

This is why Oracle made a splash last year when it released its mobile application development framework (ADF) and opened up the playing field for business leaders trying to find the right fit for their organizations. According to FierceMobileIT, this was a major step forward for the role of mobile in the larger scheme of enterprise IT.

"The most complex, time-consuming and expensive part of any enterprise mobile application has to do with how you integrate that mobile app into the enterprise," Suhas Uliyar, vice president of mobile strategy product management at Oracle, told the news source.

Front and back-end development is key
Whether an organization is looking to integrate mobile resources exclusively within its network or allow customers and clients to use their devices as well, a customized solution will be the best approach as unique challenges and goals arise down the line. 

Database experts to assist companies in cloud migration

Wed, 2014-02-26 02:04

Operating through an on-premise data center is gradually becoming less attractive to company executives constantly bombarded with sentiments praising the operational capabilities of the cloud. As a result, many are considering making the transition to the environment, but lack the resources required to do so.

According to David Linthicum, a contributor to InfoWorld, cloud computing is expected to reach $235.1 billion by 2017, triple the amount spent on the technology in 2011. The expert noted that the majority of investment will be in infrastructure, application hosting and data analytics. Transitioning such a large amount of in-house data centers to the cloud will require the sophistication of database experts capable of orchestrating the integration in a timely, cost-efficient manner.

"Enterprises will use either a direct and simple port (lift-and-shift) or more complex rebuilds, where the applications are modified to use specific cloud services," the news source stated.

Linthicum acknowledged that data-as-a-service investment has skyrocketed, primarily due to two key factors.

  1. The technology database support services are offering possesses quality transaction processing and analytics applications.
  2. During the transition period, remote DBA providers allow clients to adapt to complex database tools.

Responsibility of the chief information officer
CloudTweaks reported that many CIOs invest in DBA services to help their businesses transition to the cloud. Part of their responsibility is getting employees – particularly the company's IT department – acclimated to the way the remote option operates. Another appeal to outsourcing database maintenance to a third party is that doing so alleviates stress placed on the business's IT personnel.

"The fact is that implementation of any new technology solution requires an initial outlay," the news source reported.

The CIO's primary focus should be accurately communicating the abilities of his or her team to the database administration service. As each company operates under a specific model, it's necessary for those instigating the transition to the cloud to receive information on how the enterprise operates. In addition, he or she must acknowledge what applications the business plans on utilizing in the new environment.

The news source also claimed that CIOs should look for vendors capable of performing quality tests that will identify problems before any major data-compromising issues arise.

Ultimately, preparing a team of IT professionals for the transition is the key to successfully migrating to a cloud environment. Although the technology isn't new, some 20 or 30-year veterans will have to be familiarized with new protocols.

How will mobile devices bring the cloud to new heights?

Tue, 2014-02-25 15:01

Business leaders who have dabbled in bring-your-own-device and mobile device management strategies are well aware of the powerful benefits afforded by the technology, but few have taken full advantage of the movement due to shaky employee compliance, a lack of dedicated software and a host of IT security concerns. However, there is no better match for the business world than that of mobile and cloud services, and companies are beginning to realize this as personal devices take the spotlight in many of 2014's cloud-based offerings.

The state of mobile today and its bright future
It may seem redundant to discuss the growth of mobile and the success it will continue to have down the line, but a recent article from CloudTech served to reaffirm the fact that the technology is here to stay and will only be made more influential by the influx of cloud-based services. The news source highlighted a report from Cisco predicting that mobile data traffic will grow 11 times over between now and 2018 worldwide, with an expected 190 exabyte annual run rate by that year. The number of mobile Internet connections will likely exceed 10 billion by then, surpassing the predicted world population of 7.6 billion.

"Such growth is not only indicative of mobility becoming a critical characteristic of almost every network experience and the value consumers and businesses alike place on it, but it also represents the immense opportunities ahead for service providers who sit at the center of the Internet of Everything," Doug Webster, Vice President of Products and Solutions Marketing at Cisco, told the news source.

App stores built for business take off
As the cloud has shaped its identity in the past few years, a large component of its success has been the speed with which services and applications can be implemented in nearly any business environment. This has not only made it easier for companies to get the most out of these services off the bat, but it has also allowed providers to offer trials and demos of software and infrastructure that very accurately represent the functionality of the real thing. In the mobile arena where businesses are still skeptical of applications and services, this capability is a lifesaver for vendors.

With respect to mobile, Oracle is at the head of its field once again, launching a cloud application store for its 390,000 customers to test, purchase and utilize a growing list of over 100 applications.

New Oracle apps to optimize business data

Tue, 2014-02-25 01:48

As database administration services have helped maintain business information capacity, the programs necessary for harvesting big data and making sense of it all are growing to be in equal demand. Mapping consumer purchase trends and surpassing competitors is possible with the assistance of data analytics tools. 

Because corporations are looking to combine contrasting data sets, usage of remote database support has become more commonplace. Automated Trader reported that using big data analytics applications depends on what particular practices a business employs. 

"If you're a large multinational with many disparate data sets (customer databases) and you want a single customer view, then big data is probably for you," the news source stated. "If you're a smaller company, a distributor or a professional adviser and you want to make more of your current client database then you'll benefit from good analytics." 

Oracle's contribution
According to Database Trends and Applications, software engineering company Oracle released two new applications, the PeopleSoft In-Memory Financial Position Analyzer and PeopleSoft Financial Allocations Analyzer, which are capable of leveraging the entirety of a business's customer financial data, deploying live production information and placing these operations into an in-memory model, which is supported by database experts. 

"Traditionally, CFOs have had to model their financial position by exporting information, usually in summarized form to a spreadsheet or other external analytics, resulting in data that is not complete and is less than real-time," said John Webb, vice president of Oracle PeopleSoft.

Oracle experts claim that the programs utilize a closed loop process, which eliminates the need to export information out of the enterprise system. For example, business protocol changes are evaluated in program-initiated simulations that are automatically deployed into production. Because the data is being processed and organized via the software in real-time, executives know that the information is current and accurate. 

The greatest benefit these two programs provide companies with is the ability to predict customer needs. In that respect, the software isn't the first of its kind, but the fact that it removes unnecessary data transitioning procedures is a big win for corporations with limited resources. The software isn't exclusively appropriate for large enterprises, either. Small and mid-sized businesses could benefit from the tools it provides as well. 

Getting more acquainted with customer finances can give a company a huge leg up over the competition. Being able to figure out what consumers are willing to pay for will enable businesses to adjust their marketing campaigns and determine how their products can become more affordable. 

New cyber security mandates expected to appear

Mon, 2014-02-24 09:15

Harnessing predictive analytics software often necessitates the use of database support services. An incredible amount of information that could be used to chart market events and consumer trends is now flowing into businesses' data centers quicker than ever, requiring a stable environment.  

According to Data-Informed Magazine, e-commerce sites have become more involved with their customers and are going beyond routine merchandise catalogs. Now, applications have the ability to track consumer website navigation, customize real-time promotions, construct buyer profiles and utilize insights from social media platforms to create a more interactive experience. Such activity wouldn't be possible without predictive analytics tools or remote database support. 

"Under the hood, the application is constantly connecting the dots, joining data from many fast changing data tables to create that customized experience," the news source reported. 

Consumers have grown more comfortable with e-commerce over the past couple of years, despite the threat of security breaches. The majority of the data that travels through a website is live and operational, requiring the surveillance and protection offered by database experts well versed in monitoring virtual assets. In addition, the prevalence of data analysis applications has caught the attention of the United States Congress. 

Discussions in the U.S. Senate
According to Mondaq, Senator Pat Leahy D-VT, along with Senators Chuck Schumer D-N.Y., Franken, and Blumenthal, sponsored a new bill that would require organizations handling personal information of 10,000 or more U.S. citizens to deploy rigorous protective measures, including a system that audits the performance and vulnerabilities of their security architecture. The proposed law would also require organizations' remote database support to notify federal authorities in the event of a breach compromising 5,000 people. 

A separate bill sponsored by Senators Jay Rockefeller D-W.Va., Mark Pryor D-Ark., Bill Nelson D-Fla. and Feinstein would require every business handling personal information to carry out the prescribed security measures, regardless of the amount of clients. In the event of a security breach, the law would require companies to notify consumers within 30 days. 

Despite these differences, both of the legislative proposals call for civil penalties in the event of a security or notification violation. The two bills also introduced criminal liability clauses if a company sustains a data breach and fails to notify consumers or authorities. 

If either one of the prospective bills is passed, the demand for quality database administration is sure to increase. Having a staff of experienced, resourceful IT professionals monitoring data activity 24/7 is the best way to establish a vigilant security system. 

Northeast adopting DBA services

Mon, 2014-02-24 01:49

The 21st Century is revolutionizing the way people consume information. It's not uncommon for elementary school students to learn about their surroundings through websites, which are constantly maintained by remote database services. Many constituents often look to government entities, such as the Treasury Department, to provide facts and statistics that are held in complex data centers.

Education initiatives
According to Lohud Magazine, New York has delayed the release of EngageNY, a statewide database containing identifiable information on the Empire State's student body. The data was supposed to be disclosed in April to non-profit organization inBloom, which specializes in providing an array of resources to educators. However, the company won't receive data from EngageNY until the start of the next school year. The news source stated that the delay was the result of privacy concerns from parents and lawmakers.

Tom Dunn, a spokesman for the Education Department, claimed that the full set of data, which will contain names and addresses of the students, will be shared through a protected connection with EngageNY's partners two months before the full release. The program's database administration service is reevaluating its security procedures in order to ensure that all protective measures are in place and has found no potentially threatening bugs thus far.

"When all the data is submitted, the database will include information about student enrollment, attendance, behavior and test grades," Lohud stated. "Students, teachers and parents will have access, while third parties would have to access if they contact with the state for 'specific contracted educational purposes only,'" the news source reported.

Fostering opinions
The refined information that the government provides to constituents is often accepted as truth. However, lawmakers are realizing that legislative conclusions only provide one perception of any given situation. The Middletown Press reported that Connecticut Gov. Dan Malloy recently signed an executive order that will divulge raw government-collected data to the public. The governor claimed that the state will seek the assistance of a remote database support group to create the online portal.

Malloy stated that the purpose of the order is to provide unrefined information so that it can be analyzed by researchers and entrepreneurs. He hopes that the initiative will help foster economic growth in cities such as Hartford and Bridgeport. The executive order requires public organizations to immediately commence accumulating data to be posted in the database.

The incentive to supply unrefined data to the public may renew constituent confidence in the government's ability to provide quality services.

Leading database vendors move in the direction of open source

Sun, 2014-02-23 15:13

Open-source software is nothing unusual in the world of IT, and its success is beginning to make an impact on the datacenter hardware landscape. A few years ago it would seem highly unlikely that any major database provider would open source its architecture and allow access to a boundless community of users, but according to BusinessCloudNews, Microsoft recently announced that it would be teaming up with Facebook on its Open Compute Project (OCP). Not only is this a significant event for one of the world's leading IT enterprises, but industry analysts are predicting that it will lead to a variety of advancements in cloud and web-scale computing and encourage other major providers to do the same.

"These servers are optimized for Windows Server software and built to handle the enormous availability, scalability and efficiency requirements of Windows Azure, our global cloud platform. They offer dramatic improvements over traditional enterprise server designs: up to 40 percent server cost savings, 15 percent power efficiency gains and 50 percent reduction in deployment and service times," Bill Laing, corporate vice president for cloud and enterprise at Microsoft, told the news source.

Why open source rules the startup world
While OCP is making headlines with its Microsoft partnership, startups worldwide are enjoying the independence and flexibility granted by open source architecture, with some even choosing to make their own code accessible to all. According to VentureBeat, this trend has taken pressure off of small firms looking to maintain their autonomy and save on expensive platforms. There is also a greater level of reliability offered by open source, which is essential for any small firm trying to stay afloat without having to worry about faulty systems. As open source becomes even more popular, businesses at all levels can expect a faster and smarter cloud for the future.

Government expresses interest in database

Fri, 2014-02-21 01:18

Multi-platform database support has been renowned for its ability to provide businesses with the infrastructure necessary to maintain vast amounts of information and operate applications that improve day-to-day office functions. Now, the United States federal government has expressed interest in harnessing database capabilities to maintain records.

However, many organizations and U.S. constituents believe that the government has nefarious motives. According to The Washington Post, the U.S. Department of Homeland Security was interested in contracting a private database administration firm to install a nationwide information center containing data collected from license-plate readers. The news source stated that U.S. Immigration and Customs Enforcement would use the system to assist agents in locating and arresting criminal aliens, which would reduce surveillance hours and improve officer safety. 

In light of widespread privacy concerns and government mistrust, the article released by the Post convinced DHS Secretary Jeh Johnson to cancel the solicitation, CBS News reported. Gillian Christensen, a spokeswoman for Immigration and Customs Enforcement, stated that the organization's leadership was unaware that the contract request had been posted to begin with. 

Citizen's perspective
The remote database monitoring solution could have stored data on every car that crossed the paths of license-plate readers, regardless of whether or not the vehicle owners possessed criminal backgrounds. Before the solicitation's dissolution was announced, The American Civil Liberties Union stated that the initiative was simply another strategy for federal authorities to collect massive amounts of data on civilians without any just cause. 

"The group has argued that constant monitoring can 'chill the exercise of our cherished rights to free speech and association,'" The Washington Post reported. 

Public service initiatives 
On the other hand, the Tustin-based California Public Policy Center recently activated a database containing detailed 2012 pension information from 31 California public pension funds. In addition, the group is currently seeking to obtain data from 38 other systems. According to Merced Sun-Star, CPPC President Mark Bucher claimed that he hopes the availability of the information will motivate constituents to advocate for improved compensation. 

Although prospective retirees expressed concern over the vulnerability of the information, Bucher claimed that a court judge ultimately decided that the data was public. 

The collection of large data stores is becoming more prevalent with each year. As private businesses continue to use predictive analytics tools in an attempt to gain a competitive edge, it's no surprise that state legislatures and federal agencies are interested in how databases can help them learn about their citizens. 

Utah seeks to increase database security

Thu, 2014-02-20 02:21

Concern for IT security brings with it assumptions about on-premise databases, cloud computing and the "Big Brother" syndrome that has gripped United States enterprises and citizens. Because data is collected from an incredibly wide range of sources, constituents are concerned about whether or not their information is exempt - especially on a cloud platform. 

According to the Standard-Examiner, Barry Conover, executive director of Utah's Tax Commission, claims that the agency has taken appropriate measures, such as data encryption, to ensure that its database is well protected. Conover also noted that Target's credit card fiasco has been followed by six new federal mandates and 54 new clarifications. One regulation requires any state department that uses mobile devices to own them. 

Protection from the industry 
At a conference in San Francisco, Larry Ellison, CEO of IT developer Oracle, allayed attendees' concerns about security breaches, stating that the company has not sustained a database infiltration in nearly 20 years. The apprehension was sparked by allegations that the National Security Agency has been monitoring Oracle's cloud customers. 

"One unique aspect of Oracle's database is that administrators can't look at the data inside," Ellison informed the audience. The CEO cited a multi-platform database support system as one of the elements protecting customer information. 

According to CRN, Oracle's partners have claimed that its security is top notch. Rhos Dyke, executive vice president of California-based vendor Cloud Creek Systems, stated that his clients lauded Oracle's ability to collect and manage massive amounts of data originating from variable sources, while still maintaining confidentiality. 

Solutions from the executive 
To ease apprehension toward the federal government, President Obama, along with White House administrators, released a list of voluntary guidelines on February 12 for companies looking to improve database security. 

According to CIO Magazine, the framework, which was initially developed by the U.S. National Institute of Standards and Technology, provides a general agreement on what a quality cybersecurity program looks like. In general, IT security professionals have submitted positive reviews of the guidelines. 

Jeff Greene, senior policy counsel at a security software vendor, claimed that the framework is flexible and adaptable, and that both small and large-scale businesses can benefit from it. Another industry professional, Tom Kellerman, claims that although the framework isn't as complex as he would like it to be, it establishes a solid foundation. He also noted that health care and financial services firms will be among the first businesses to adopt the executive-endorsed guidelines. 

Remote database administration boosts business performance

Thu, 2014-02-20 02:17

Although cloud migration appears to be the biggest trend in business optimization, in-house databases are adapting to the shifting technological climate. The increased complexity of these intricate infrastructures means that enterprises are looking for database administration professionals to manage their systems off-premise. This outsourcing strategy provides corporations with the ability to consolidate their resources towards delivering their services to customers, and lets remote management teams focus on supporting the operations. 

However, this shift is not happening without necessary adaptation. According to Brian King, a contributor to Datacenter Journal, software-defined data centers are able to offload workloads effectively and are capable of dealing with system failures to reduce service downtime. Although the hardware remains necessary, it is finding new uses in virtualization, allowing enhanced control and greater agility.

"The data center has become critical to the efficient operation of the modern enterprise," noted King. "Internal projects, external applications, user data and everything else are now entrusted to data centers." 

The personnel factor
"The C-level and management responsible for continuous data center operations now have more headaches and concerns than ever before and they rely more than ever on the workplace individuals inside the data center," Datacenter Dynamics reported. 

This is one of the many reasons why businesses have continued to utilize remote database management systems. A company specializing in maintaining optimal data center performance has the necessary resources available to give the system their undivided attention. Big data's increasing prevalence has been accompanied by a demand for larger storage capabilities and bandwidth, meaning much more complex data centers. The task may be simply overwhelming for a 12-person IT staff to manage on its own. 

Linking communication between the enterprise and companies providing remote database monitoring solutions is the chief financial technology officer. According to Datacenter Dynamics, because the infrastructure of data centers is growing more complex, investors and stakeholders are viewing them as equal in importance to product development and profit margins. 

Standardizing the database 
The source reported that organizations such as the American Institute of Architects and the Institute of Electrical and Electronics Engineers have recognized that establishing performance requirements has pushed databases into green initiatives as well. Awareness of eco-friendly practices could potentially help reduce data centers' contribution to climate change. 

As opposed to acclimating a small or mid-sized IT department to new architectures, remote database management companies already versed in the technology will be able to deploy services much more quickly and efficiently. 

Creating Dynamic Data Sources in SSRS

Wed, 2014-02-19 10:27

For my first blog post, I decided to write about a cool little project that came across my desk a few months ago. The request was to create an SSRS report that could be used to insert records into a database by supplying the user with dropdown parameter values from a list of tables. With a simple stored procedure, you can easily set up a report to insert records into a table, but there was an added requirement for the user to be able to select the site (database) that the values should come from. As the user needed to be able to select the site first, the rest of the parameters needed to be set up depending on which value they selected. You can set up the data sources dynamically using a couple of different techniques, but I'll explain how I approached it.
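
For context, the insert procedure behind a report like this can be very simple. The sketch below is purely illustrative; the procedure, table and parameter names are my own placeholders rather than the actual objects from this project:

CREATE PROCEDURE dbo.usp_InsertSiteRecord
    @SiteName    VARCHAR(100),
    @RecordValue VARCHAR(255)
AS
BEGIN
    SET NOCOUNT ON;
    -- Each report parameter maps to one procedure parameter (illustrative names)
    INSERT INTO dbo.SiteRecords (SiteName, RecordValue, InsertedOn)
    VALUES (@SiteName, @RecordValue, GETDATE());
END;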

The first step that I took was creating a table within my DBA database to host the possible databases for each site. (If you don't have a DBA database set up on your instance, I highly recommend it, as one can be very handy when setting up dynamic packages/reports as well as testing new processes.) I also added a bit "isActive" column to allow users to easily come in and adjust which sites should be used.

(Screenshot: my local table, set up and ready for the report.)
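
As a rough sketch of what that table might look like (the object and column names here are placeholders of my own, not necessarily what was used):

CREATE TABLE dbo.SiteDatabases
(
    SiteID       INT IDENTITY(1,1) PRIMARY KEY,
    SiteName     VARCHAR(100) NOT NULL, -- friendly label shown to the report user
    DatabaseName VARCHAR(128) NOT NULL, -- database the dynamic data source should point to
    isActive     BIT NOT NULL DEFAULT 1 -- set to 0 to hide a site from the report
);

INSERT INTO dbo.SiteDatabases (SiteName, DatabaseName)
VALUES ('Site A', 'SiteA_DB'),
       ('Site B', 'SiteB_DB');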

Once you have the table set up, you can begin creating the initial data source to pull back these values. The first data source you want to create will connect to the instance and database where you just created your site-list table. Using a query like the one below in your dataset, you can bring back the list of active sites for the user to choose from.

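Assuming the placeholder table sketched above, the dataset query only needs to return the active rows, something along these lines:

-- Only sites flagged as active are offered to the user
SELECT SiteName, DatabaseName
FROM dbo.SiteDatabases
WHERE isActive = 1
ORDER BY SiteName;

Returning both a friendly site name and the database name lets the parameter display one and pass the other.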

Once you have your dataset configured, you will want to create a parameter on the report for the user to select their requested site. You can create this parameter just as you would any other: right-click the Parameters folder, select Add Parameter, and fill in the name and label. For the available values, add the dataset that you just created, with the value and label fields set to your database names.

(Screenshot: configuration for the database selection parameter.)

Now that the user has selected their requested site, you can create the dynamic data source to be used for the rest of the parameters. To create this, you will want to add a new data source and then click on the expression button beneath the Edit button.

(Screenshot: setting up the data source to use an expression.)

In this case, all of the databases are stored on the same server, so you can hardcode the server name, but you will need to pass in the user's selection for the database. To do this, choose the Parameters category in the expression editor to make the selection. The syntax for your connection string should now look similar to this:

="Data Source=SERVERNAME; initial catalog=" & Parameters!SelectSite.Value

(Screenshot: the expression, set up and ready for use.)

You can now use this data source for the remaining datasets to pull back any other parameters that may be needed. I’ve found that you will need to create another temporary data source to one of your databases when setting up your datasets in order to populate the available columns. Once the columns are validated, you can change the connection back to your dynamic data source and everything will be good to go.

(Screenshot: the username dynamically changes depending on the site selection.)

You can also easily modify this expression to dynamically set the Instance your report runs against instead. I have found this technique very useful when creating dashboards, as you can easily select which instance to check. What if you needed the user to be able to select both the SQL instance and the database? Well, you could store the full connection string within an added column in your table and, when setting up your parameter, use that column for your returned Value field. From there, you can adjust your data source expression to pass in the selected value.
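
Assuming a hypothetical ConnectionString column holding full strings such as "Data Source=SERVERNAME\INSTANCE; Initial Catalog=SiteA_DB", and a parameter that returns that column as its value, the data source expression reduces to passing the selection straight through:

=Parameters!SelectSite.Value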

I hope you've enjoyed my first blog post. If you have any feedback or questions, please leave them in the comments below or look me up on my Google+ profile! Check back soon for other great blog posts from my colleagues and me.

Point-of-sale system malware tops IT security concerns this year

Wed, 2014-02-19 05:39

By now, consumers and business leaders alike should know of the pressing threat that cyberattackers pose to retailers and financial institutions across the world. Following the major breaches at Target and Neiman Marcus this past holiday season, IT decision-makers are scrambling to reevaluate and strengthen their security measures to minimize the likelihood of a harmful attack on their organizations. But while security strategists know that they must take action to avoid the fate of losing customers' data by the millions, the workings of the point-of-sale (POS) malware that executed last year's attacks are still unknown to many. PCWorld pointed to "ChewBacca" malware as the primary variant responsible for the unprecedented magnitude of these incidents.

Playing a high-stakes game with cybercriminals
While there is not a shred of enjoyment to be had when creating defenses meant to fend off cybercriminals, Computerworld likened the process to a game of chess to demonstrate the tactical, high-pressure tasks that IT security experts face every day. On the other side of things, attackers need to stay on their game as well if they want to successfully breach the perimeter of a network. Here is the rundown of a typical attack from the perspective of a hacker looking to access company or customer data:

  1. Before any information is viewed or extracted, the attacker must first gain access to a company network, getting past that first line of defense. Computerworld mentioned that this can be done via phishing or by going to a third-party source to gather usernames and passwords. This process has also been made easier by the widespread use of social media platforms, which encourage users to post personal information on a public forum. This is a gold mine for hackers who come across password hints asking for a detail that can easily be found online for free.
  2. Once access is granted, attackers will find a vulnerable link in a network – ideally one that processes heavy flows of customer information including payment data, names, card verification value (CVV), account numbers, and expiration dates. Computerworld explained that once this data has been taken, it is sent to a drop point within the company's own network before being shipped off to a secure location for the cybercriminals' use. This information is then sold on the digital black market for a premium price.

Now that POS systems have been identified as a weak point in company networks, IT leaders can hopefully strengthen their security measures to prevent future breaches.