Chris Foot

Remote DBA Experts Blog

RDX Services: Optimization [VIDEO]

Mon, 2014-11-03 15:57

Transcript

Hi, welcome to RDX. When searching for a database administration service, it's important to look for a company that prioritizes performance, security and availability.

How does RDX deliver such a service? First, we assess the vulnerabilities and drawbacks that are preventing your environments from operating efficiently. Second, we make any applicable changes to ensure your business software is running optimally. From there, we regularly conduct quality assurance audits to prevent performance issues from arising.

In addition, we offer 24/7 support, every day of the year. We recognize that systems need to remain online continuously, and we're committed to making sure they remain accessible.

Thanks for watching!


Data Warehouse Appliance Offerings

Fri, 2014-10-31 11:15

Introduction

Information Technology units will continue to be challenged by the unbridled growth of their organization’s data stores. An ever-increasing amount of data needs to be extracted, cleansed, analyzed and presented to the end user community. Data volumes that were unheard of a year ago are now commonplace. Day-to-day operational systems are now storing such large amounts of data that they rival data warehouses in disk storage and administrative complexity. New trends, products, and strategies, guaranteed by vendors and industry pundits to solve large data store challenges, are unveiled on a seemingly endless basis.

Choosing the Large Data Store Ecosystem

Choosing the correct large data store ecosystem (server, storage architecture, OS, database) is critical to the success of any application that must store and process large volumes of data. This decision was simple when the alternatives were few. With the seemingly endless array of architectures now available, the choice is no longer clear-cut. Database administrators have more options than ever before. To correctly design and implement the most appropriate architecture for their organization, DBAs must evaluate and compare entire large data store ecosystems, not individual products.

Traditional Large Data Store Technologies

Before we begin our discussion on the various advanced vendor offerings, we need to review the database features that are the foundation of the customized architectures we will be discussing later in this article. It is important to note that although each vendor offering certainly leverages the latest technologies available, the traditional data storage and processing features that DBAs have been utilizing for years remain critical components of the newer architectures.

Partitioning

Partitioning data into smaller disk storage subsets allows the data to be viewed as a single entity while overcoming many of the challenges associated with the management of large data objects stored on disk.

Major database vendor products offer optimizers that are partition aware and will create query plans that access only those partitioned objects needed to satisfy the query’s data request (partition pruning). This feature allows administrators to create large data stores and still provide fast access to the data.

Partitioning also allows applications to take advantage of “rolling window” data operations, which let administrators roll off data that is no longer needed. For example, a DBA may roll off the partition containing last July’s data as this year’s data for July is added (see the sketch below). If the data is ever needed again, administrators are able to pull it from auxiliary or offline storage devices and plug it back into the database.
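As a minimal illustration (Oracle-style syntax; the table, columns and partition names are ours, not taken from any particular system), the sketch below creates a range-partitioned table, runs a query the optimizer can satisfy by pruning to a single partition, and then rolls off the oldest month:

-- Range-partitioned table, one partition per month (names are illustrative)
CREATE TABLE sales (
    sale_id   NUMBER,
    sale_date DATE,
    amount    NUMBER(10,2)
)
PARTITION BY RANGE (sale_date) (
    PARTITION p_2013_07 VALUES LESS THAN (DATE '2013-08-01'),
    PARTITION p_2014_07 VALUES LESS THAN (DATE '2014-08-01')
);

-- Partition pruning: only the partition covering July 2014 is scanned
SELECT SUM(amount)
FROM   sales
WHERE  sale_date >= DATE '2014-07-01'
AND    sale_date <  DATE '2014-08-01';

-- Rolling window: after archiving it, roll off last July's partition
ALTER TABLE sales DROP PARTITION p_2013_07;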

Query Parallelism

Query parallelism improves data access performance by splitting work among multiple CPUs. Most database optimizers are also parallel aware and are able to break up a single query into sub queries that access the data simultaneously.

Without parallelism, a SQL statement’s work is performed by a single process. Parallel processing allows multiple processes to work together to simultaneously process a single SQL statement or utility execution. By dividing the work necessary to process a statement among multiple CPUs, the database can execute the statement more quickly than if the work were single-threaded.

The parallel query option can dramatically improve performance for data-intensive operations associated with decision support applications or very large database environments. Symmetric multiprocessing (SMP), clustered, or massively parallel systems gain the largest performance benefits from the parallel query option because query processing can be effectively split up among many CPUs on a single system.
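To make that concrete, here is a hedged sketch (Oracle-style hint syntax; it reuses the illustrative sales table from above, and the degree of parallelism is our own choice) of asking the optimizer to spread one statement's work across multiple CPUs:

-- Request that the scan and aggregation run with a degree of parallelism of 8
SELECT /*+ PARALLEL(s, 8) */
       TRUNC(sale_date, 'MM') AS sale_month,
       SUM(amount)            AS total_amount
FROM   sales s
GROUP  BY TRUNC(sale_date, 'MM');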

Advanced Hardware and Software Technologies

Let’s continue our discussion by taking a high-level look at the advanced data warehouse offerings from the three major database competitors: Oracle, Microsoft and IBM. Each vendor’s offering is a proprietary data warehouse ecosystem, often called an appliance, consisting of hardware, OS and database components. We’ll complete our review by learning more about Hadoop’s features and benefits.

Oracle Exadata

Oracle’s Exadata Database Machine combines the Oracle database with intelligent data storage servers to deliver very high performance for large data store applications. Exadata is a purpose-built warehouse ecosystem consisting of hardware, operating system and database components.

Oracle Exadata Storage Servers leverage high speed interconnects, data compression and intelligent filtering and caching features to increase data transfer performance between the database server and intelligent storage servers. In addition, the Exadata architecture is able to offload data intensive SQL statements to the storage servers to filter the results before the relevant data is returned to the database server for final processing.

Exadata uses PCI flash technology rather than flash disks. Oracle places the flash memory directly on the high speed PCI bus rather than behind slow disk controllers and directors. Each Exadata Storage Server includes 4 PCI flash cards that combine for a total of 3.2 TB of flash memory. Although the PCI flash can be utilized as traditional flash disk storage, it provides better performance when it is configured as a flash cache that sits in front of the disks. Exadata’s Smart Flash Cache will automatically cache frequently accessed data in the PCI cache, much like its traditional database buffer cache counterpart. Less popular data will continue to remain on disk. Data being sent to the PCI Flash cache is also compressed to increase storage capacity.

Exadata also offers an advanced compression feature called Hybrid Columnar Compression (HCC) to reduce storage requirements for large databases. Exadata offloads the compression/decompression workload to the processors contained in the Exadata storage servers.
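Both of these features surface to the DBA as ordinary DDL. As a hedged sketch (standard Exadata-era Oracle syntax; the table names are ours), an administrator might pin a hot table in the Smart Flash Cache and build an HCC-compressed archive table like this:

-- Keep this table's blocks in Exadata Smart Flash Cache when possible
ALTER TABLE sales STORAGE (CELL_FLASH_CACHE KEEP);

-- Hybrid Columnar Compression for a rarely updated archive copy
CREATE TABLE sales_archive
    COMPRESS FOR QUERY HIGH
    AS SELECT * FROM sales;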

These technologies enable Exadata to deliver high performance for large data stores accessed by both decision support and online operational systems. The Exadata machine runs an Oracle database, which allows Oracle-based applications to be easily migrated. Oracle describes the Exadata architecture as “scale out” capable, meaning multiple Exadata servers can be lashed together to increase computing and data access horsepower. Oracle RAC, as well as Oracle’s Automatic Storage Management (ASM), can be leveraged to dynamically add more processing power and disk storage.

Microsoft SQL Server PDW

SQL Server Parallel Data Warehouse (PDW) is a massively parallel processing (MPP) data warehousing appliance designed to support very large data stores. Like Oracle’s Exadata implementation, the PDW appliance’s components consist of the entire database ecosystem including hardware, operating system and database.

MPP databases use a “shared-nothing” design: multiple physical servers (nodes) each run an instance of the database with their own dedicated CPU, memory and storage.

Microsoft PDW’s architecture consists of:

  • The MPP Engine
    • Responsible for generating parallel query execution plans and coordinating the workloads across the system’s compute nodes
    • Uses a SQL Server database to store metadata and configuration data for all of the databases in the architecture
    • In essence, it acts as the traffic cop and the “brain” of the PDW system
  • Compute Nodes
    • Each compute node also runs an instance of the SQL Server database
    • The compute nodes’ databases are responsible for managing the user data

As T-SQL is executed in the PDW system, queries are broken up to run simultaneously over multiple physical nodes, using parallel execution to provide high performance data access. The key to success with PDW is selecting the appropriate distribution columns, which are used to intelligently distribute the data amongst the nodes. The ideal distribution column is one that is accessed frequently, evenly distributes data based on its values and has low volatility (doesn’t change a lot).
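As a hedged sketch (PDW-style T-SQL; the table and distribution column are hypothetical), the distribution column is declared when the table is created:

-- Hash-distribute rows across compute nodes on a stable, evenly distributed column
CREATE TABLE fact_sales
(
    sale_id     BIGINT,
    customer_id INT,
    amount      DECIMAL(10,2)
)
WITH ( DISTRIBUTION = HASH(customer_id) );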

Microsoft Analytics Platform System (APS) – Hadoop and SQL Server Integration

Microsoft’s Analytics Platform System (APS) combines its massively parallel processing offering (PDW) with HDInsight, Microsoft’s version of Apache Hadoop. Microsoft has partnered with Hortonworks, a commercial Hadoop software vendor that provides a Windows-based, 100% Apache Hadoop distribution. Please see the section below for more detailed information on the Hadoop engine.

Integrating a Hadoop engine into SQL Server allows Microsoft to capture, store, process and present both structured (relational) and unstructured (non-relational) data within the same logical framework. Organizations wanting to process unstructured data have often turned to standalone Apache Hadoop environments, which required them to learn new data storage technologies, languages and an entirely new processing architecture.

Microsoft’s PolyBase provides APS users with the ability to query both structured and non-structured data with a single T-SQL based query. APS application programmers are not required to learn MapReduce or HiveQL to access data stored in the APS platform. Organizations using APS do not incur the additional costs associated with retraining their existing staff or hiring personnel with experience in Hadoop access methods.
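To illustrate the idea (PolyBase-style T-SQL; this sketch assumes an external data source and file format named hadoop_ds and delimited_text were defined beforehand, and every object name here is ours), a DBA can expose Hadoop files as a table and join them to relational data in one query:

-- Declare a schema over delimited files living in Hadoop
CREATE EXTERNAL TABLE web_clicks
(
    user_id INT,
    url     VARCHAR(900)
)
WITH
(
    LOCATION    = '/logs/clicks/',
    DATA_SOURCE = hadoop_ds,
    FILE_FORMAT = delimited_text
);

-- Join non-relational click data to a relational customer table in plain T-SQL
SELECT c.name, COUNT(*) AS clicks
FROM   web_clicks AS w
JOIN   dbo.customers AS c ON c.user_id = w.user_id
GROUP  BY c.name;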

IBM PureData Systems for Analytics

Not to be outdone by its database rivals, IBM provides a proprietary appliance called IBM PureData System for Analytics. The system, powered by Netezza technology, again combines the hardware, database and storage into a single platform offering. PureData System for Analytics is an MPP system utilizing IBM blade servers and dedicated disk storage servers that, like its competitors, is able to intelligently distribute workloads amongst the processing nodes.

IBM leverages field-programmable gate arrays (FPGAs) in its FAST engines. IBM runs a FAST engine on each node to provide compression, data filtering and ACID compliance on the Netezza systems. The real benefit of FAST is that the FPGA technology allows the engines to be custom tailored to the instructions being sent to them for processing. The compiler divides the query plan into executable code segments, called snippets, which are sent in parallel to the Snippet Processors for execution. The FAST engine is able to customize its filtering according to the snippet being processed.

IBM’s Cognos, DataStage and InfoSphere BigInsights software products are included in the offering. IBM’s goal is to provide a total warehouse solution, from ETL to final data presentation, to PureData System for Analytics users.

In addition, IBM also provides industry-specific warehouse offerings for banking, healthcare, insurance, retail and telecommunications verticals. IBM’s “industry models” are designed to reduce the time and effort needed to design data warehousing systems for the organizations in these selected business sectors. IBM provides the data warehouse design and analysis templates to accelerate the data warehouse build process. IBM consulting assists the customer to tailor the architecture to their organization’s unique business needs.

Non-Database Vendor Technologies

New “disruptive” products that compete with the traditional database vendor offerings continue to capture the market’s attention. These products span the spectrum from NoSQL offerings that provide easy access to unstructured data to entirely new architectures like Apache’s Hadoop.

Major database vendors will make every effort to ensure that disruptive technologies gaining market traction become an enhancement, not a replacement, for their traditional database offerings. Microsoft’s APS platform is an excellent example of this approach.

Apache Hadoop

Apache’s Hadoop is a software framework that supports data-intensive distributed applications under a free license. Hadoop clusters commodity servers to offer scalable, affordable large-data storage and distributed processing in a single architecture.

A Hadoop cluster consists of a single master and multiple worker nodes. The master provides job control and scheduling services to the worker nodes. Worker nodes provide storage and computing services. The architecture is distributed, in that the nodes do not share memory or disk.

A distributed architecture allows computing horsepower and storage capacity to be added without disrupting on-going operations. Hadoop’s controlling programs keep track of the data located on the distributed servers. In addition, Hadoop provides multiple copies of the data to ensure data accessibility and fault tolerance.

Hadoop connects seamlessly to every major RDBMS through open-standard connectors providing developers and analysts with transparent access through tools they are familiar with. When used simply as a large-data storage location, it is accessible through a variety of standards-based methods such as FUSE or HTTP. Hadoop also offers an integrated stack of analytical tools, file system management and administration software to allow for native exploration and mining of data.
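For instance, an analyst comfortable with SQL can explore files already sitting in Hadoop through Hive's SQL dialect, HiveQL. A hedged sketch (the table layout and file paths are ours):

-- HiveQL: overlay a schema on files in HDFS, then query them in place
CREATE EXTERNAL TABLE clicks
(
    user_id BIGINT,
    url     STRING,
    ts      TIMESTAMP
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION '/data/clicks';

-- Top ten pages by hit count
SELECT url, COUNT(*) AS hits
FROM   clicks
GROUP  BY url
ORDER  BY hits DESC
LIMIT  10;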

The Hadoop architecture is able to efficiently distribute processing workloads amongst dozens or even hundreds of cost-effective worker nodes. This capability dramatically improves the performance of applications accessing large data stores. Hadoop support professionals view hundreds of gigabytes as small data stores and regularly build Hadoop architectures that access terabytes and petabytes of structured and unstructured data.

One of Hadoop’s biggest advantages is speed. Hadoop is able to generate reports in a fraction of the time required by traditional database processing engines. The reductions can be measured by orders of magnitude. Because of this access speed, Hadoop is quickly gaining acceptance in the IT community as a leading alternative to traditional database systems when large data store technologies are being evaluated.

Wrap-up

As stated previously, there is an endless array of offerings that focus on addressing large data store challenges. Large data store architecture selection is the most important decision made during a warehouse development project. A correctly chosen architecture will allow the application to perform to expectations, have the desired functionality and be easily monitored and administered. Incorrect architecture decisions may cause one or more of the following problems: poor performance, limited functionality, high total cost of ownership, complex administration and tuning, lack of scalability, poor vendor support, poor reliability/availability and so on. All market-leading database vendors understand the importance of addressing the challenges inherent in large data stores and have released new products and product enhancements designed to simplify administration and improve performance.


Open Source Virtualization Project at Risk [VIDEO]

Fri, 2014-10-31 04:02

Transcript

Hi, welcome to RDX. Virtualization and cloud technology pretty much go hand-in-hand.

Many popular cloud providers, such as Amazon and Rackspace, use Xen, an open-source virtualization platform, to optimize their environments.

According to Ars Technica, those behind the Xen Project recently released a warning to those using its platform. Apparently, a flaw within the program's hypervisor allows cybercriminals to corrupt a Xen virtual machine, or VM. From there, perpetrators could read information stored on the VM, or cause the server hosting it to crash. Monitoring databases hosted on Xen VMs is just one necessary step companies should take. Reevaluating access permissions and reinforcing encryption should also be priorities. 

Thanks for watching! Be sure to visit us next time for any other advice on security vulnerabilities. 


Oracle upgrades MySQL’s capabilities

Mon, 2014-10-27 01:26

In the era of big data, database administration services are finding ways to work with NoSQL environments, Hadoop and other solutions. 

Yet, these professionals aren't so quick to write off the capabilities of open source relational databases. When Oracle acquired MySQL through its purchase of Sun Microsystems, a few critics were skeptical of the software giant's intentions. Evidently, the company intends to refine and improve MySQL as much as possible.

MySQL Fabric 1.5
I Programmer highlighted MySQL Fabric 1.5, a framework that allows DBAs to better manage collections of MySQL databases. Through OpenStack, an open source cloud computing software platform used to support Infrastructure-as-a-Service solutions, Fabric 1.5 users can employ a wider range of sharding keys.

One of the most notable functions Fabric 1.5 has to offer is its ability to automatically detect failures and then employ failover through MySQL Replication. Basically, if the server for the master database unexpectedly shuts down, Fabric 1.5 chooses a slave database to become the new master. 

Cloud provisions 
For various reasons, more organizations are hosting databases in cloud environments. Managing servers distributed across broad infrastructures requires tools that can quickly identify and assign tasks to particular machines, and Oracle has recognized this.

Christopher Tozzi, a contributor to The VAR Guy, noted that Oracle's Enterprise Manager can now support MySQL, allowing enterprises using the solution to better monitor and administer database functions for public and private cloud deployments. A statement released by the developer asserted that Enterprise Manager also allows users to migrate to MySQL technology.

Ulf Wendel's contribution 
Another feature DBAs are sure to benefit from is a plugin that enables Web and JavaScript developers to connect to MySQL using HTTP. Oracle Senior Software Engineer Ulf Wendel developed the add-on, which allows users to send SQL queries through URLs; results are returned as JSON documents describing the columns, rows and result sets in specific databases.


Database administrators may have to worry about POODLE attacks

Fri, 2014-10-24 11:18

Encryption is a regular component of database security, point-of-sale implementations, network protection and a plethora of other IT considerations. 

One protocol, SSL 3.0, was recently deemed sub-par. Dark Reading noted that Google experts discovered a vulnerability in the nearly 15-year-old encryption protocol that could potentially allow cybercriminals to initiate man-in-the-middle attacks against users.

What is "man-in-the-middle"?
MITM intrusions are some of the most malicious attacks organizations can sustain. According to Computer Hope, a MITM attack occurs when an attacker intercepts the path between an entity sending information and the person or system receiving the data.

For example, if Person A delivered an email to Person C, then Person B could initiate a MITM attack, manipulate the message however he or she sees fit, and then transfer the email to Person C. As one can see, this capability is quite dangerous. 

A fault for the skilled 
Google researchers dubbed the vulnerability CVE-2014-3566 and named the attack that exploits it Padding Oracle On Downgraded Legacy Encryption, or POODLE. Apparently, a POODLE infiltration would be incredibly difficult to pull off, meaning only the most experienced hackers are capable of using the method to their advantage.

Next steps 
Although SSL 3.0 has been superseded by newer encryption protocols, it's still employed to support antiquated software and older client systems. Nevertheless, these applications and machines likely hold valuable information for many companies, and enterprises should strongly consider consulting database administration services to disable SSL 3.0 and move to newer encryption protocols.

As far as vendor-related services go, Google will remove SSL 3.0 from its client programs, while Mozilla intends to do the same on November 25. 

Despite this cause for concern, White Ops Chief Scientist Dan Kaminsky assured Dark Reading that it's "not as bad as Heartbleed," but still a consideration companies should take seriously.


RDX services: Platform-specific Expertise [VIDEO]

Fri, 2014-10-24 04:28

Transcript

Hi, welcome to RDX. RDX has a wide range of platform-specific experience to help keep your database environment highly available and performing at its best. Our DBAs can help supplement any gaps in skill sets, leaving your internal team to focus on the tasks they do best.

Whether you prefer to store information in SQL Server, Oracle, MySQL, PostgreSQL or Hyperion/Essbase, our specialists provide you with a wealth of expertise and support. Our staff is well-versed in optimizing and protecting all of these environments 24×7, providing your business with a greater peace of mind.

In addition to our varied expertise, we provide clients with the choice of customizing their environments. We’re here to accommodate any of your unique business needs, and our DBA experts are equipped to solve your toughest challenges.

Thanks for watching. Be sure to watch next time.


RDX Services: Proactive Monitoring [VIDEO]

Wed, 2014-10-22 07:29

Transcript

Hi, welcome to RDX. When a mission-critical system becomes unavailable, it can threaten the survivability of an organization. That’s why RDX has a Database Operations Center team responsible for the proactive monitoring of all clients’ environments, 24×7.

Our monitors are custom tailored for every environment we support, and our specialists are trained in database and operating system problem resolution. This combination delivers peace of mind for our clients, who know the Database Operations Center is watching out for their highly available, high performance, mission-critical environments. If a major problem does transpire, our experts notify the client immediately and create a game plan for resolving the situation.

Thanks for watching! Next time, we'll discuss our platform-specific solutions.


Oracle Makes 155 Fixes for Patch Tuesday [VIDEO]

Tue, 2014-10-21 12:33

Transcript

Welcome to RDX! For those using Oracle products, Oracle’s October Critical Patch Update contains an unusually high number of security bug fixes.

ZDNet contributor Liam Tung noted that Oracle released patches for 155 security flaws across 44 of its products on October 14. The fixes include 25 for Java SE; the components affected include Java SE, Java SE Embedded, JavaFX and JRockit. The highest Common Vulnerability Scoring System (CVSS) rating among the Java fixes was a 10, the highest rating available.

Also included are 32 fixes for Oracle Database Server products, with at least one receiving a CVSS rating of 9; 17 fixes for Oracle Fusion Middleware; four for Oracle Retail Applications; 15 for the Oracle Sun Systems Products Suite; and 24 for Oracle MySQL.

Many of these vulnerabilities may be remotely exploitable without authentication.

Thanks for watching!


RDX Services: Full DBA Support [VIDEO]

Mon, 2014-10-20 13:23

Transcript

Hi, and welcome to RDX. In this portion of our "services" series, we'll discuss how we provide companies with all of their database administration needs.

With RDX's full DBA support services, we become your DBA team and assume complete responsibility for the functionality, security, availability and performance of your database environments. We know that each company has unique goals and demands, which is why we also implement guidelines and protocols based on your organization's specific requirements.

In addition, we can fill any DBA role from our offerings that your company may need. You get the expertise and best practices of over 100 DBA experts for less than the cost of a single in-house resource.

Thanks for watching! Stay tuned for other ways to work with RDX soon.


Public, private health care systems possess security vulnerabilities

Fri, 2014-10-17 11:12

System and database administrators from health care institutions are facing several challenges.

On one hand, many are obligated to migrate legacy applications to state-of-the-art electronic health record solutions. In addition, they need to ensure the information contained in those environments is protected.

Operating systems, network configurations and a wealth of other factors can either make or break security architectures. If these components are unable to receive frequent updates from vendor-certified developers, it can cause nightmares for database administration professionals. 

Windows XP no longer a valid option 
When Microsoft ceased to provide support for Windows XP in early April, not as many businesses upgraded to Windows 7 or 8 as the software vendor's leaders had hoped. This means those using XP will not receive regular security updates, leaving them open to attacks as hackers work to find vulnerabilities in the OS.

Despite continuous warnings from Microsoft and the IT community, Information Security Buzz contributor Rebecca Herold believes that a large percentage of medical devices currently in use are running XP. Her allegations are based on reports submitted by health care electronics producers stating that they leverage XP for their sensors' graphical user interfaces, as well as to create connections to external databases.

Because Microsoft has not released the source code of XP, health care companies using these implementations have no way of identifying vulnerabilities independently. Even if the source code were distributed, it's unlikely that the majority of medical providers could use in-house resources to search for security flaws. The only practical way to defend the servers linked to devices running XP is to employ active database monitoring.

Public sector experiencing vulnerabilities 
Healthcare.gov apparently isn't picture-perfect, either. FedScoop reported that white hat hackers working for the U.S. Department of Health and Human Services' Office of Inspector General discovered that personally identifiable information was secured, but some data controlled by the Centers for Medicare and Medicaid Services lacked adequate protection.

After an assessment of CMS systems and databases was completed, the inspector general advised the organization to encode files with an algorithm approved by Federal Information Processing Standard 140-2. However, authorities at the CMS concluded this wasn't necessary.

Although this wasn't the first audit of Healthcare.gov (and it likely won't be the last), the information held within its servers is too valuable for cybercriminals to ignore. Setting up an automated, yet sophisticated intrusion detection program to notify DBAs when user activity appears inconsistent is a step the CMS should strongly consider taking. 


JPMorgan hack joins list of largest data breaches in history [VIDEO]

Fri, 2014-10-17 09:02

Transcript

Hi, welcome to RDX. With news about data breaches sweeping the Web on a regular basis, it's no surprise that the latest victim was a major U.S. bank.

According to Bloomberg, hackers gained access to a server operated by JPMorgan Chase, stealing data on 76 million households and 7 million small businesses.

After further investigation, the FBI discovered the hackers gained access to a server lacking two-factor authentication. From there, the hackers found flaws in the bank's custom software, through which JPMorgan's security team unknowingly gave them access to sensitive data banks.

To prevent such attacks from occurring, firms should regularly assess their databases and solutions to find vulnerabilities.

Thanks for watching! Be sure to visit us next time for info on RDX's security services.


Need Experts in Open Source Databases? [VIDEO]

Thu, 2014-10-16 10:27

Transcript

Looking for the expertise it takes to enhance your open source databases?

Welcome back to the RDX blog. Whether you prefer to store your information in MySQL or PostgreSQL, we can provide you with a complete range of administrative support.

In addition to 24×7 onshore and remote service, our staff can deploy sophisticated monitoring architectures customized to fit both MySQL and PostgreSQL. This ensures your data is available at all times.

Speaking of accessibility, our experts are well-versed in advanced PostgreSQL tools, such as the new Foreign Data Wrapper functionality. According to Silicon Angle, this feature enables staff to easily pull in remote objects stored in analytic clusters.

Thanks for watching, and be sure to join us next time. 


What Do You Need to Secure EHR? [VIDEO]

Tue, 2014-10-14 11:29

Transcript

Electronic health records are becoming a regular part of the healthcare industry, but are organizations taking the right measures to secure them?

Hi, welcome to RDX. EHR systems can help doctors and other medical experts monumentally enhance patient treatment, but they also pose serious security risks.

SC Magazine reported an employee of Memorial Hermann Health System in Houston accessed more than 10,000 patient records over the course of six years. Social Security numbers, dates of birth and other information were stolen.

In order to deter such incidents from occurring, health care organizations must employ active security monitoring of their databases. That way, suspicious activity can readily be identified and acted upon; a sketch of one such control appears below.
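As a hedged illustration only (SQL Server audit syntax; the audit names, file path and table are hypothetical), a DBA could record every read of a patient table so unusual access patterns can be reviewed:

-- Server-level audit writing to a file (run in master; names and path are illustrative)
CREATE SERVER AUDIT ehr_audit
    TO FILE (FILEPATH = 'D:\audits\');

-- Capture every SELECT against the patient records table (run in the target database)
CREATE DATABASE AUDIT SPECIFICATION ehr_select_audit
    FOR SERVER AUDIT ehr_audit
    ADD (SELECT ON OBJECT::dbo.patient_records BY public)
    WITH (STATE = ON);

-- Switch the audit on
ALTER SERVER AUDIT ehr_audit WITH (STATE = ON);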

Thanks for watching! Be sure to join us next time for more security best practices and tips.


What the Internet of Things means for database administration

Fri, 2014-10-10 08:41

When it comes to storing data produced by Internet-connected devices, relational databases may not make the cut.

The push for NoSQL environments is growing in tandem with the rapid adoption of wearable devices and industrial sensors. Database administration services are customizing systems to fit the needs of manufacturers, hospitals, retailers and other companies investing heavily in the Internet of Things.

The rise of the Industrial Internet
General Electric has expressed a considerable amount of interest in the technology, outfitting its wind turbines, jet engines, locomotives and other machines with sensors to boost efficiency. The goal is to develop smarter implementations that will cumulatively save economic sectors hundreds of billions of dollars.

To achieve this goal, GE is partnering with Cisco, Intel and Verizon to create intelligent factories and public infrastructures, according to InformationWeek. For example, while Cisco's rugged routers will be placed throughout a facility, Intel's chips and servers will support computations. To enable GE's sensors to use these assets, Verizon will provide networking. With this environment, GE's devices will be able to perform predictive maintenance, power optimization and other tasks.

GE isn't preventing organizations from customizing these platforms, either. InformationWeek noted the company is developing Predix, a data aggregation and processing solution that's based on open-source technologies. This enables GE customers to create their own algorithms, code and data sources.

How it affects database administration
As distributed machines communicate with one another, produce native data and draw conclusions of their own, storing such information in SQL databases isn't the best option, noted IoT World contributor Sherly Mendoza. The issue comes down to how these environments store information.

While SQL systems offer consistency and reliability, Mendoza maintained they require a lot of processing power to manage indexes and buffers as well as lock, latch and log data. In addition, although database experts can mitigate this problem, relational databases aren't easy to scale.

The solution lies in non-relational servers, which don't require administrators to assign information to schemas. The result is an architecture comprised of simple data structures that can be manipulated and adjusted as professionals require. Because the data produced by the IoT is primarily unstructured, NoSQL databases are the optimal choice.

One key advantage of NoSQL environments is their ability to pull and organize data distributed across multiple servers. This means they can also scale across two machines or more, making hardware capacity less of a concern.

Preparing for the IoT boom requires considerable knowledge of non-relational databases, as well as the staff capable of enhancing these implementations.


Oracle embraces open source technology [VIDEO]

Fri, 2014-10-10 07:47

Transcript

Want the benefit of managing an open source database? Oracle's looking to help you.

Welcome to RDX. Oracle Enterprise Manager will now include administration and monitoring functions for MySQL, the world's most popular open source relational database.

Database Trends and Applications noted that cloud and on-premise MySQL versions will receive a new line of backup, auditing and encryption functions to help database experts reinforce security. In addition, Oracle Enterprise Manager enables MySQL database users to monitor availability, measure and boost performance, and configure thresholds.

Thanks for watching!


How to keep databases up and running during the holidays [VIDEO]

Wed, 2014-10-08 13:14

Transcript

Welcome to RDX. For our retail customers, the holiday season is a critical time of year for revenue generation. The increased activity can put additional stress on transactional databases.

Here are some best-practice suggestions from RDX Director of Technical Sales Katy Park to ensure your databases are ready for the holiday season:

First, put in a high availability solution if you do not have one.

Secondly, run a test of your DR plan to ensure you can meet your time to recovery objectives.

Ask your DBA for code tuning suggestions for queries that are run often and utilize a lot of resources.

You should also consider removing the reporting load from your transactional database if reports are currently running on the production server.

And finally, review object sizes and maximum server capacities; a sample query follows below.
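As a rough sketch of that last check (SQL Server syntax; how you act on the numbers is up to you), the following query lists each database file's current size and growth ceiling so you can spot files nearing their caps:

-- Current size and configured ceiling for every database file
-- (sizes are stored in 8-KB pages; dividing by 128 converts to MB)
SELECT DB_NAME(database_id)  AS database_name,
       name                  AS logical_file,
       size / 128            AS size_mb,
       CASE max_size
            WHEN -1 THEN 'unlimited'
            ELSE CAST(max_size / 128 AS VARCHAR(20)) + ' MB'
       END                   AS growth_ceiling
FROM   sys.master_files
ORDER  BY database_name, logical_file;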

Thanks for watching, and we'll see you next time!


Health care an open target for hackers [VIDEO]

Fri, 2014-10-03 13:31

Transcript

Think hackers are only after your credit card numbers? Think again.

Hi, welcome to RDX. While the U.S. health care industry is required by law to secure patient information, many organizations are only taking basic protective measures.

According to Reuters, the FBI stated Chinese cybercriminals had broken into a health care organization's database and stolen personal information on about 4.5 million patients. Names, birth dates, policy numbers, billing information and other data can be easily accessed by persistent hackers.

Databases holding this information need active monitoring and automated surveillance tools to ensure access is properly restricted. In addition, encrypting patient files is a critical next step.

Thanks for watching. For more security tips, be sure to check in frequently.  


What do businesses need to prepare for cloud migration?

Thu, 2014-10-02 13:47

Whether to host applications or increase storage, migrating workloads to cloud environments is a consistent trend. However, many database support services are discovering that businesses unfamiliar with the technology often don't know where to begin. 

It appears more enterprises will need guidance in the near future. Business Cloud News conducted a survey of 312 IT professionals across the United Kingdom, Europe and North America, finding 40 percent of participants believe 30 to 70 percent of their IT assets will be hosted in the cloud in the next two years. 

So, what are some pain points interested parties should be cognizant of? 

1. A lack of in-house capabilities 
It's a point organizations have made in the past, but it still deserves acknowledgement. Although in-house IT staff members are capable of executing the transition from on-premise systems to a cloud environment, many require extensive instruction before they can do so. Even after training is completed, their lack of experience will likely cause interruptions.

In this regard, outsourcing is a safe choice. Hiring remote DBA experts to work with existing teams to migrate all applications and storage to a cloud infrastructure will expedite the process while also ensuring long-term issues don't persist. 

2. Look at what applications are connected to 
Hybrid cloud deployments are quite common among organizations that want to host a portion of their IT assets in the cloud but retain full back-end control over critical applications.

Suppose a company that leverages a hybrid environment wants to transition its enterprise resource planning (ERP) solution to a hosted environment. However, the ERP's file stores reside on on-premise servers. In order for the ERP solution to undergo migration, the file stores it depends on to operate must be relocated beforehand.

3. Observe indirect connections
Some on-premise deployments may seem isolated from other implementations, yet they can encounter hindrances when operating in the cloud. TechTarget noted one example detailed by Robert Green, principal cloud strategist at IT consultancy Enfinitum, who stated one of the firm's clients migrated an application to a public cloud environment without conducting a thorough assessment prior to initiation.

What the company failed to recognize was that on-premise firewalls that assessed and filtered Internet traffic would directly impact its employees' ability to access the cloud-hosted application. When 400 users attempted to use the software, the firewalls became overloaded. In the end, the Enfinitum client lost $10 million because its workers were unable to use the application. 

If these three points are carefully considered, enterprises will be successful in all their cloud migration endeavors. 


What to expect at this year’s Gartner Symposium [VIDEO]

Tue, 2014-09-30 15:01

Transcript

Unsure of how IT will impact enterprises in the near future?

Hi, welcome back to RDX! CIOs will probably encounter a number of challenges in the years ahead. The Gartner Symposium will feature presentations on strategic IT procurement, critical industry trends and how businesses can gain value from the latest technologies.

The conference will be held at the Dolphin Hotel in Orlando, Florida from October 5th to the 9th. Notable speakers will be Microsoft CEO Satya Nadella and Lyft Inc. President and Co-Founder John Zimmer.

As you can imagine, we'll be informing attendees about our database monitoring and optimization services. If you want to find us, we'll be located at Booth 206 during show floor hours.

Thanks for watching! Can't wait to see you in Florida!


RDX IT Process Automation

Tue, 2014-09-30 14:11

RDX’s IT Process Automation Strategy

Remote DBA Experts (RDX) is the largest pure-play provider of remote data infrastructure services. We have been providing remote services for over 20 years, which also makes us one of the pioneers in this space. We currently support hundreds of customers and thousands of database implementations.

Remote data infrastructure services is an extremely competitive market arena. Our competitors range from “2 guys in a garage” to major outsourcing providers like IBM and Oracle. Improving and enhancing our support architecture isn’t merely beneficial to RDX; it is critical to our competitive survival.

One of our primary responsibilities at RDX is to research and evaluate leading-edge OS, database and application support technologies. The goal of these efforts is to ensure that RDX customers continue to receive the highest level of value from RDX’s support services. RDX’s strategy is to continue to be a pioneer in the remote services space – just as we were 20 years ago. One of the key technologies that RDX is implementing to ensure our continued leadership as a remote services provider is IT process automation.

What is IT Process Automation?

Process automation, because of its wide range of application, takes many forms. Manufacturing companies have been using industrial robots to replace activities traditionally performed by humans for some time. Business process automation shares the same goal: to replace business functions performed by humans with software applications. Work activities that are repetitive in nature and require little intelligent analysis and decision making to complete are prime candidates for process automation.

Business software applications, by their very nature, are designed to automate processes. Software programmers create intelligent decision trees that evaluate and refine stored data elements, then either display that processed data for human interaction or automate the decision-making process entirely.

Automation products are designed to act upon stored data or capture it for processing. The data is analyzed using workflows (decision trees) and embedded rules. The automation product then performs a prescribed set of actions. The automation product can continue processing by executing additional workflows, prompt for human intervention or complete the process by performing an activity.

For the purposes of this article, IT process automation is the implementation of software to programmatically automate routine (requiring little decision making), repetitive workflows and tasks performed by IT knowledge workers.

The Automation Tool Marketplace

A highly competitive market forces all automation vendors to accelerate the release of new products as well as enhancements to existing offerings. Automation vendors know that new features and functionalities are not a requirement for competitive advantage; they are a requirement for competitive survival. The more competitive the space, the greater the benefit to the consumer. Vendor competition will ensure that automation products become more intelligent, more cost effective and easier to implement and administer.

As the number of features provided by automation products grows, so does the importance of taking advantage of those new features. Automation product licensing and vendor maintenance contracts command a premium price in the marketplace. To gain the most return on their investment, companies must ensure that they are completely leveraging the benefits of the particular automation product being used. Understanding all of the inherent features is important, but selecting the features that bring each individual implementation the most benefit is the key to success.

The endless array of automation offerings adds complexity to product selection. IT automation products span the spectrum from niche offerings that focus on automating a very well-defined, specific set of tasks to products that provide a complete framework and set of tools designed to generate more global efficiencies by automating a wide range of activities. More traditional software vendors, including database and monitoring tool providers, realize that automation features provide their offerings with an advantage over competitors’ products.

RDX’s Automation Strategy

Process automation products have been on RDX’s technological radar for years. Various products provided bits and pieces of the functionality we required, but we were unable to identify an offering that provided a total automation solution.

Like many shops, RDX interwove various scripts, programs and third-party products to automate repetitive tasks. Automation was done in an ad hoc, opportunistic manner as tasks were identified. RDX’s challenge was to select and implement a product that would provide a framework, architecture and set of tools that RDX could utilize to implement a company-wide automation architecture. The goal was to transform RDX’s automation activities from opportunistic and ad hoc to a strategic initiative with a well-defined mission statement, a clear set of achievable goals and detailed project plans with deliverables to obtain them.

RDX’s Process Automation Goals

RDX has two primary sources of repetitive tasks:

  • Customer event data collection, diagnosis and resolution
  • Internal support activities

Our goals for our automation strategy can be summarized into the following main points:

  • Improve the quality and speed of problem event analysis and resolution. Faster and higher quality problem resolution equals happy RDX customers.
  • Increase staff productivity by reducing the number of mundane, repetitive tasks the RDX staff is required to perform
  • Reduce operating costs through automation

Our environment is not entirely unique. Our service architecture can be compared to any IT shop that supports a large number of disparate environments. The resulting challenges we face are fairly common to any IT service provider:

  • The desire to provide immediate resolutions to all performance and availability issues (reduce Mean Time to Resolution)
  • The need to respond to client events with greater accuracy
  • Implementing a software solution that allows RDX to capture and record pockets of tribal knowledge, then leverage that subject matter expertise by transforming it into automated processes to foster a culture of continuous process improvement
  • Reducing the amount of time RDX spends on both customer-facing and internal repetitive tasks, allowing our support professionals to focus on higher-ROI support activities
  • Providing the ability to quickly prove audit and compliance standards through report logs capturing the results of each automation task
  • Scaling with RDX’s rapid growth, which requires us to process an exponentially increasing number of event alerts and administrative activities; continuously hiring additional resources to manage processes and data is not a scalable or cost-effective solution

RDX’s Automation Product Selection

RDX performed a traditional vendor analysis using a standardized evaluation methodology. A methodology can be loosely defined as a body of best practices, processes and rules used to accomplish a given task. The task in this case is to evaluate and select an automation product provider.

A needs analysis was performed to generate a weighted set of functional and technical requirements. The focus of the analysis was on selecting a product that would help us achieve our goal of implementing a strategic automation solution, as opposed to just buying a product. If we were unable to identify a solution that met our requirements, we were willing to delay the vendor selection process until we found one that did.

RDX selected GEN-E Resolve as our automation tool provider. GEN-E Resolve was able to provide the “end-to-end” architecture we required to automate both customer event resolution and RDX internal processes. GEN-E Resolve’s primary focus is the automation of complex incident resolution, and it is a popular product with large telecommunication providers that support thousands of remote devices. What RDX found most beneficial was that the product did not require the installation of any software on our customers’ servers. All processing is performed on RDX’s Resolve servers running at our data center.

RDX’s First Step – Automatic Event Data Collection

The primary service we provide to our customers is ensuring their database systems are available at all times and performing as expected. Database administrators, by the very essence of our job descriptions, are the protectors of the organization’s core data assets. We are tasked with ensuring key data stores are continuously available. However, ensuring that data is available on a 24 x 7 basis is a wonderfully complex task.

When a mission-critical database application becomes unavailable, it can threaten the survivability of the organization. The financial impact of downtime is not the only issue that faces companies that have critical applications that are offline. Loss of customer goodwill, bad press, idle employees and legal penalties (lawsuits, fines, etc.) must also be considered.

It is up to the database administrator to recommend and implement technical solutions that deal with these unforeseen “technology disruptions.” When they do occur, it is our responsibility as DBAs to restore the operational functionality of the failed systems as quickly as possible.

RDX’s initial goal was to automate the collection of information required to perform problem analysis. The key to faster problem resolution is to reduce the amount of time spent collecting diagnostic data and spend that time analyzing it instead.

RDX prioritized customer events using the following criteria:

  • Frequency the event occurs
  • Severity of customer impact
  • Amount of time required to manually collect diagnostic data (reduce Mean Time to Resolution)
  • Complexity of the diagnostic data collection process (increase resolution accuracy)
  • Amount of human interaction required to collect diagnostic data (cost reduction)

RDX deployed a team of in-house automation specialists to collect the operational knowledge required to create the decision trees, workflows and data collection activities traditionally performed by RDX personnel. Our implementation, although still in its infancy, has met our initial expectations.

RDX has automated the diagnostic data collection process for several events and has proven that the automation tool can perform the tasks quickly, consistently and with high quality. RDX has also successfully implemented automatic problem resolution tasks for simple events. Subsequent enhancements to our automation capabilities are to leverage RDX’s collective operational knowledge to quickly resolve more complex issues.

Although our initial goal was to improve the speed and quality of our problem resolution process, our intent is to also move forward with the automation of our internal support processes. One of the key facets of the project’s success was keeping RDX personnel informed about the automation project and the benefits the implementation would provide to both RDX customers and internal support technicians. Promoting the product was crucial, as we found that it led to a veritable groundswell of internal process automation recommendations. Our intent is to formalize the internal process automation project by appointing RDX personnel as project owners and soliciting recommendations through company surveys (as opposed to an ad hoc manner). Once the recommendations are collected, RDX will perform the same type of prioritization as we did during the initial stages of product implementation.

The Future of Automation

Although we will continue to see the greatest advances in automation in the traditional manufacturing spaces, IT process automation will continue to grow and mature until it becomes integrated into the fabric of most IT organizations. Larger shops will be the early adopters of IT automation, as they will be able to more quickly realize the benefits the solution provides than their smaller counterparts. As stated previously, a very competitive market arena will continue to accelerate the features and functionality provided by vendor products. As the offerings mature, they will become more robust, more intelligent and more cost effective. As a result, the adoption rate will continue to grow, as it would with any technology possessing these traits.

In the remote services space, it is how RDX intends to differentiate ourselves from our competitors. Outsourcing providers that manage large numbers of remote targets will be required to automate, or they will quickly lose market share to those competitors that do. It is RDX’s intention to be an innovator and not a “close follower” of automation technologies.
