Public authorities are taking the purported benefits of moving operations to the cloud with a grain of salt.
While security will be a concern no matter what the technology, the primary reason officials are so apprehensive about cloud computing is a perceived lack of control. A number of organizations already outsource to remote database administration services to secure their environments and keep critical solutions operable, but putting an entire infrastructure in the hands of a private company is a far more jarring prospect.
Hesitating to advance
InformationWeek contributor Elena Malykhina noted a survey of 153 government IT executives conducted by MeriTalk, which found that 43 percent of respondents compared transitioning processes to the cloud to handing their son the keys to a new convertible. The research also produced the following findings:
- Nine in 10 agencies address security directly: 42 percent are keeping security responsibilities on-premise, while 41 percent are requiring cloud vendors to obtain cloud defense certifications.
- Exactly half of respondents are incapable of identifying which departments own certain data.
- More than half (55 percent) believe cloud technology will make it easier for authorities to organize digital information.
As one can see, the general sentiment among many federal IT managers is that they lack the expertise and assistance needed to manage their databases and transition those implementations to cloud environments.
The rules of the trade
It's possible for remote DBAs to help allay these concerns. Replicating databases before migration and monitoring environments post-transition are both possible when outside assistance is received. With this in mind, there are several rules Government Computer News advised authorities to keep in mind in regard to cloud technology:
- Employ virtualization beforehand: In short, virtualization optimizes servers by allowing them to run more applications than is conventionally possible. The technology provides the basis of cloud computing.
- It doesn't have to be public: Surrendering all operations to a cloud provider isn't necessary. Private and hybrid environments are both valid options, as they allow organizations to exercise more control over their architectures.
- Look for compliance: One thing many government entities are already doing quite well is searching for cloud providers that satisfy standards defined by the Federal Risk and Authorization Management Program.
- Start with applications: Use Software-as-a-Service deployments before transitioning all IT assets over to the cloud. This will get users and administrators used to the technology.
Cloud computing isn't anything to be fearful of, but that doesn't mean cautionary steps shouldn't be taken. Consulting specialists to diagnose a provider's capabilities will give agencies a clear idea of whether a particular hosting company is the right fit for them.
The post How to transition government databases to the cloud appeared first on Remote DBA Experts.
Welcome back to RDX. A proper test environment should be a regular part of your business' Change Management Process. However, if Personally Identifiable Information (PII) is not removed from the test data, sensitive information could be exposed.
According to eWEEK, Mozilla accidentally exposed critical information in two separate incidents. The most recent was first reported August 27, and left 97,000 developers’ information exposed for approximately 3 months. The landfill.bugzilla.org development system exposed information including email and encrypted passwords. Initial disclosure is thought to have occurred during a database migration with a database dump including user data. Users of this system have been advised to change their passwords.
Mozilla is now revising its test plan to exclude database dumps. An additional step businesses can take to protect their PII is to require two-factor authentication for access.
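Scrubbing PII before data ever reaches a test environment is the simplest safeguard against the kind of exposure described above. A minimal sketch of the idea, using hypothetical field names that would vary with your schema:

```python
import json

# Fields assumed to contain PII -- hypothetical names; adjust for your schema.
PII_FIELDS = {"email", "password", "real_name"}

def scrub_record(record):
    """Return a copy of a user record with PII fields replaced by placeholders."""
    return {
        key: ("REDACTED" if key in PII_FIELDS else value)
        for key, value in record.items()
    }

def scrub_dump(records):
    """Scrub every record before it is written to a test-environment dump."""
    return [scrub_record(r) for r in records]

users = [{"id": 1, "email": "dev@example.com", "password": "hash", "role": "admin"}]
print(json.dumps(scrub_dump(users)))
```

Running a step like this as a mandatory part of any database dump destined for test systems would have kept the leaked emails and password hashes out of landfill.bugzilla.org.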
Thanks for watching!
The post Mozilla Working to Enhance its Security Process [VIDEO] appeared first on Remote DBA Experts.
While the retail sector has received much attention lately due to the prevalence of cyberattacks, utilities are also in grave danger of sustaining Web-based attacks.
Database administration services are acknowledging the persistence of Dragonfly, an organization of cybercriminals that focuses on infiltrating the industrial sector. Securing Web-based assets, as well as on-premise architectures, is imperative, as these carefully orchestrated hacks could leave residents without power indefinitely.
A grievous situation
Warwick Ashford, a contributor to Computer Weekly, noted an instance in which he and his team participated in the Kaspersky Industrial Protection Simulation role-playing game. The scenario presented a situation in which a regional water purification plant was infiltrated, leaving thousands of people without drinking water. In addition, the company sustained incredible losses within a five-week period.
The culprit? Dragonfly, a collective based out of Eastern Europe that security firm Symantec has blamed for hacking more than 1,000 North American and European energy enterprises. Dragonfly specializes in disabling industrial control systems, which manage electrical, water, gas and oil infrastructure.
Why it's so dangerous
Ashford noted the water treatment company intended to train its IT staff in cybersecurity protocols, but apparently never got around to doing so. After a number of network audits were performed, the company recognized two unidentified Internet connections to the industrial control system. The IT department responded by setting up a number of firewalls.
However, after the security solutions were implemented, Ashford's team still received notifications of unhealthy chlorine levels in the firm's drinking water. Through phishing scams, the perpetrators were able to manipulate the industrial control system's output and filtration of chlorine.
While the aforementioned situation was only a simulation, Dragonfly is fully capable of inflicting such damage. Imagine the harm the group could cause in a real-world version of this scenario.
FierceSmartGrid noted one instance in 2011 in which Dragonfly initiated "Energetic Bear," a campaign that leveraged two remote access Trojans. These tools form a bridge between cybercriminals and targeted architectures; in essence, they allowed Dragonfly to insert malware into a U.S. utility's databases.
The source noted a statement by Symantec, which maintained Dragonfly's initiatives are focused on "extracting and uploading stolen data, installing further malware onto systems and running executable files on infected computers."
It's this particular breed of cybercriminal that has created the need for active database monitoring. Maintaining thorough surveillance at all times is a must, especially for utilities distributing essential resources to consumers.
- A list of name/value pairs, which is known in other languages as an object
- An organized list of values, also called an array
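The two structures above are easy to see in practice. A short sketch using Python's standard `json` module, with a made-up document for illustration:

```python
import json

# A JSON document combining both structures: an object (name/value pairs)
# containing an array (an ordered list of values).
doc = '{"name": "inventory", "items": [1, 2, 3], "active": true}'

parsed = json.loads(doc)   # the object becomes a dict, the array a list
print(parsed["name"])      # prints inventory
print(parsed["items"])     # prints [1, 2, 3]

# Serializing back out reproduces the same document structure.
print(json.dumps(parsed, indent=2))
```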
Why add JSON support to databases?
Unstructured data, a type of information that is ubiquitous in the current Digital Age, needs to be stored in documents, which is exactly how JSON manages data. Many NoSQL databases such as MongoDB, Couchbase and Hadoop follow this document-oriented model, which has made it a favorite among Web developers, InfoWorld noted.
In order to compete with such architectures, software giant Oracle added JSON support to the company's Oracle 12c databases, a move outlined at the NoSQL Now conference in San Jose, California last month. This is a break from the conventional relational database management system architecture, and it's presented as a counter to PostgreSQL, which has long been regarded as the open source alternative to Oracle.
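The basic idea of keeping JSON documents inside a relational table can be sketched in a few lines. This is a conceptual illustration only (it does not use Oracle's actual JSON syntax): documents are stored as text in an ordinary column, then deserialized and filtered in application code.

```python
import json
import sqlite3

# Store JSON documents as text in a relational table (illustrative sketch,
# not Oracle 12c's actual JSON feature set).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE docs (id INTEGER PRIMARY KEY, body TEXT)")

orders = [{"customer": "acme", "total": 120}, {"customer": "globex", "total": 75}]
for order in orders:
    conn.execute("INSERT INTO docs (body) VALUES (?)", (json.dumps(order),))

# Pull the documents back out and filter on a field inside the JSON.
rows = [json.loads(body) for (body,) in conn.execute("SELECT body FROM docs")]
large = [r for r in rows if r["total"] > 100]
print(large)  # prints [{'customer': 'acme', 'total': 120}]
```

Native JSON support in the database engine moves that filtering step into SQL itself, which is precisely what makes the Oracle 12c addition notable.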
Is it a valid option?
Still, DBA services may advise their clients to keep using Oracle 12c for tabular data and conventional NoSQL solutions for semi-structured information. InfoWorld noted that the latter follows a "scale out" approach as opposed to a "scale up" one.
Scaling out enables NoSQL solutions to leverage commodity servers to enhance performance, as opposed to bulking up a single massive database server. In addition, the way a document-based database allocates information makes these systems highly resistant to failure, because the data is distributed across multiple servers.
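The distribution step at the heart of "scale out" can be sketched simply: each document is hash-partitioned onto one of several commodity nodes. The server names below are purely illustrative.

```python
import hashlib

# Hypothetical commodity nodes in a scaled-out document store.
SERVERS = ["node-a", "node-b", "node-c"]

def shard_for(doc_id: str) -> str:
    """Pick a node deterministically from the document's key."""
    digest = hashlib.sha256(doc_id.encode()).hexdigest()
    return SERVERS[int(digest, 16) % len(SERVERS)]

# Distribute six documents across the cluster.
cluster = {server: [] for server in SERVERS}
for doc_id in ("user:1", "user:2", "user:3", "user:4", "user:5", "user:6"):
    cluster[shard_for(doc_id)].append(doc_id)

# Each document lives on exactly one node, so losing a node loses only
# that node's shard rather than the whole dataset.
print({server: len(docs) for server, docs in cluster.items()})
```

Real systems layer replication on top of this so that a lost shard can be rebuilt from copies on other nodes, which is where the failure resistance comes from.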
When will the day come?
InfoWorld classified modern databases into three types:
- RDBMS, which handle structured data
- NoSQL, which manage semi-structured information
- Hadoop, which organizes unstructured data
The source proposed an interesting possibility: that all three systems be unified into a single solution. JSON could potentially provide a structure for just such a database, but it's unknown whether Oracle, IBM or another tech company would be able to successfully develop it (the profits for said enterprise would be huge).
Yet, it's more likely those in the open source community would manufacture a database capable of seamlessly handling structured, semi-structured and unstructured data. Just look at how monumental Hadoop has been.
The post Database manufacturers include JSON in latest provisions appeared first on Remote DBA Experts.
Need to give your databases a boost?
Hi, welcome back to RDX! If your organization's handling large, data-intensive workloads, IBM's DB2 for Linux, Unix and Windows is an attractive alternative.
RDX has worked with DB2 since the beginning, and our DB2 solutions are architected to provide a superior level of DB2 database support. From day-to-day operations to strategic decision making, our DB2 solutions arm customers with the experience, skillsets and best practices required to maximize their critical DB2 environments.
RDX also provides support for IBM’s IMS product set which offers the availability, protection, performance and scalability companies need to process online transactions.
Thanks for watching, and be sure to refer to our company resume for more information on our DB2 and IMS services!
The post Getting the Whole DB2 package, Additional Services Series Pt. 8 [VIDEO] appeared first on Remote DBA Experts.
In regard to big data, the majority of the information produced by enterprises and consumers alike is unstructured, meaning architectures such as MongoDB and Hadoop must be utilized.
This has created a schism between structured and unstructured data. Combining the two requires database administration professionals to query relational and non-relational architectures separately and then merge the results so that analysis tools can consider the information together.
Using REST APIs
Representational state transfer application programming interfaces offer somewhat of a solution to this problem. For those who are unfamiliar with the technology, REST and API are two separate concepts. According to TechTarget, the former is an architectural style that operates over HTTP to read and manipulate Web resources, often represented as Extensible Markup Language (XML) or JSON documents. APIs, in turn, provide a set of protocols and tools for building software applications.
It can therefore be deduced that REST APIs are used to develop programs capable of reading information on Web pages. But how does this affect a DBA's ability to combine and process both unstructured and structured data?
First, it’s important to scrutinize what REST is typically used for. TechTarget noted the architecture is used to provide functionality to social networking channels, mobile applications and business process automation. On the surface, it would appear skillful database experts could program REST APIs to segregate information the same way Hadoop does and organize it into relational databases, but given the availability of the open source project, doing so isn’t necessarily practical.
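The core REST pattern is small enough to sketch without any network code: HTTP-style verbs and resource paths map onto handlers that read or create data. The endpoint and resource names below are hypothetical.

```python
# A toy, in-memory sketch of REST-style dispatch -- no real HTTP involved.
DATA = {"users": {"1": {"name": "Ada"}}}

def handle(method: str, path: str, body=None):
    """Route a (verb, resource path) pair to a read or create operation."""
    _, collection, *rest = path.split("/")
    if method == "GET" and rest:
        # GET /users/1 -> read one resource by id
        return DATA[collection].get(rest[0], "404 Not Found")
    if method == "POST":
        # POST /users -> create a resource, return its new id
        new_id = str(len(DATA[collection]) + 1)
        DATA[collection][new_id] = body
        return new_id
    return "405 Method Not Allowed"

print(handle("GET", "/users/1"))                    # prints {'name': 'Ada'}
print(handle("POST", "/users", {"name": "Grace"}))  # prints 2
```

A real REST API wraps exactly this kind of dispatch behind an HTTP server, which is what lets social networks and mobile apps expose their data uniformly.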
A shot of Espresso
One startup in Silicon Valley claims to have found a solution to this conundrum. Database Trends and Applications noted Espresso Logic created a new program that combines big data and SQL information into one cohesive REST API for building the backbone of Web and mobile software.
The REST API provides professionals with real-time access to MongoDB and SQL databases and eliminates the need to replicate data across sources. Security and business logic, two essentials in today’s IT economy, are included as a part of the package.
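The "no replication" idea is worth making concrete. The sketch below is a conceptual illustration, not Espresso Logic's actual API: each request joins a relational row and a document-store record at read time, so neither source needs to be copied into the other.

```python
# Hypothetical in-memory stand-ins for the two live sources.
sql_rows = [{"customer_id": 1, "name": "Acme Corp"}]        # relational side
mongo_docs = [{"customer_id": 1, "clicks": [3, 7, 9]}]      # document side

def get_customer(customer_id: int) -> dict:
    """Join both sources at request time into a single REST-style response."""
    row = next(r for r in sql_rows if r["customer_id"] == customer_id)
    doc = next(d for d in mongo_docs if d["customer_id"] == customer_id)
    return {**row, **doc}

print(get_customer(1))
# prints {'customer_id': 1, 'name': 'Acme Corp', 'clicks': [3, 7, 9]}
```

Because the join happens per request, each source remains the single system of record, which is what eliminates the replication step.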
For analysis’s sake, developers could use Espresso Logic’s solution to aggregate and process information much faster than they could before.
Welcome to RDX. Today, we're going to talk about how we deploy our non-database server monitoring program. What's a non-DB server? The machines that run your enterprise applications, handle network connections and perform other critical functions.
When customizing server monitoring solutions, we follow a five-step process:
One: Understand each customer’s unique monitoring needs
Two: Identify critical monitoring parameters
Three: Develop thresholds, goals and objectives
Four: Develop and document monitoring, remediation and escalation procedures
Five: Monitor and adjust the strategy as necessary
RDX dedicates an entire team responsible for creating, implementing and enhancing a strategic blueprint for the proactive monitoring and troubleshooting methods required to prevent availability and performance problems before they occur.
Companies employing RDX to support their environments do not have to spend the additional monies required to create a proactive monitoring environment.
Thanks for watching!
The post Throw Away Out-of-the-Box Monitoring Solutions, Additional Services Series Pt. 7 [VIDEO] appeared first on Remote DBA Experts.
In order to analyze your data in real-time, you need an architecture that operates at the same pace.
Hi and welcome to RDX! If you're using multiple database systems to store your information, you know how much of a pain it is to combine all that data.
For this reason, RDX offers expertise and support for GoldenGate. GoldenGate provides data capture, replication, transformation and verification between heterogeneous sources.
This means information stored in PostgreSQL, IBM DB2, and several other systems can be seamlessly aggregated in a single instance. Capturing and delivering data has never been easier!
Thanks for watching! Be sure to join us next time.
The post Access Your Data Sources by Opening the GoldenGate: Additional Services Series Pt. 6 [VIDEO] appeared first on Remote DBA Experts.