Trying to find the right database platform for your company can be an overwhelming task, but it’s still important to put some thought into choosing a specialized database platform to ensure the maximum benefit in the long run.
FairCom Vice President Randal Hoff said in a report on TechRadar.com that, first, you need to understand your business needs – the amount and type of data you’ll be harnessing, storing and analyzing. Also consider whether you have data sources kept in legacy databases and applications, which were often developed with different code bases, making the extraction process difficult.
It’s also very important to think about the future of the company. This alone could dictate whether you need one database platform over another – or the integration of both. Sometimes, companies may require more than one platform to handle different tasks, such as handling customer reviews and comments on one and processing purchase transactions on the other.
Whatever your database platform needs may be, our team of database administration experts can help you worry less and ensure that your database environments are highly available and high performance 24×7.
The post How to Choose the Right Database Platform for Your Company [VIDEO] appeared first on Remote DBA Experts.
Experts have lauded in-memory databases (IMDBs) for their ability to improve data mining and analysis endeavors. Database administration professionals have encountered IMDBs more often now than in years past, primarily because big data analytics has become so popular with businesses.
Analytics is a priority
In contrast to conventional disk storage mechanisms, IMDBs execute few central processing unit instructions, which drastically cuts down on the amount of time it takes enterprises to run information analysis applications. Mike Matchett, a contributor to TechTarget, noted that traditional transactional databases held only critical intelligence records in memory, while cheaper disks were used to store low-priority volumes. However, the modern thirst for web-based applications and real-time analytics has made IMDBs a more practical choice.
Today, database experts often find that the majority of structured databases need to be queried repetitively, which can be a slow, arduous process if the interactions occur outside of the server. About 20 years ago, this conundrum wasn't so ubiquitous, primarily because organizations weren't running complex analysis systems. To accommodate these programs, columnar machines were designed with optimized storage formats, but they still proved inadequate in the face of a high volume of transactions.
Creating appropriate environments
In order to accommodate these complex analysis solutions, German developer SAP released HANA, a scale-out in-memory database built to hold essential enterprise resource planning data. The platform is capable of providing near-real-time business intelligence and can finish in seconds reports that might take hours to run on a legacy transactional database.
According to PC World, many database support services expect that Oracle is set to unveil a new line of in-memory database products that take storage and processing beyond SAP's initial investments. On June 10, CEO Larry Ellison will host a webcast titled "The Future of the Database Begins Soon" and discuss what the IMDB means for current and future Oracle customers.
"It will be possible to perform TRUE real-time, ad-hoc, analytic queries on your organization's business data as it exists at that moment and receive the results immediately," wrote Ellison, as quoted by PC World. "Imagine your sales team being able to know the total sales they have made as of right now – not last week, or even last night, but right now."
This level of operability is sure to be a boon for enterprises looking to monitor their progress. It gives executives the ability to make decisions based on what's occurring and to influence outcomes by acting on game-changing factors.
The post For analytics users, in-memory databases may prove useful appeared first on Remote DBA Experts.
We know the whole point of outsourcing your database administration is to save money without sacrificing the quality of your work, so at RDX we use our “Collective Knowledge Approach” – leveraging the thousands of years of collective experience our team has – to solve your problems faster and more efficiently.
When you outsource your DBA with RDX, you’re not paying for a single DBA resource – you’re buying 24/7 access and support from an entire team of onshore DBA experts – all for less than the cost of one in-house resource.
Your Primary DBA is your go-to contact, knowing your unique needs and support requirements. Now, whenever you have a complex task, your DBA calls on the Subject Matter Expert with the highest skillset to specifically address your need. And behind the scenes, our PMRC team is constantly monitoring your environments to help prevent database problems before they happen.
When you’re outsourcing your DBA support, you want to make sure you get the highest quality support possible. And at RDX, our team has got you covered 24/7.
Database administrators, by the very essence of their job descriptions, are the protectors of their organization’s core data assets. They are tasked with ensuring that key data stores are safeguarded against any type of unauthorized data access. Ensuring that data is protected on a 24 x 7 basis is a complex task. External intrusions and internal employee data thefts combine to make many IT professionals lie awake at night thinking about how they can secure their sensitive database data stores.
Data breaches threaten the survivability of any organization. The financial impact of the breach is not the only issue that affects companies that are victims of unauthorized data access. Bad press, fines, legal costs and loss of customer goodwill must also be factored into the breach’s total impact on the organization.
Significant data breach announcements are publicized on a daily basis. External hackers and rogue employees continuously search for new ways to steal sensitive information. There is one component common to most thefts – the ultimate target of the breach is the database data store containing sensitive information.
Protecting Critical Data Assets
Hackers tend to be intelligent, inquisitive, patient, thorough, driven and, often, successful. This combination of traits makes data protection a formidable challenge. Those responsible for data protection have a wealth of hardware and software offerings available to them that assist in the monitoring and protection of their organization’s sensitive data stores. Data security teams are tasked with creating protection strategies based on a combination of procedures, controls and product offerings.
These product offerings range the spectrum, from antivirus and Security Information and Event Management software to state-of-the-art firewalls and perimeter monitoring hardware components. The seemingly endless array of hardware and software protection offerings available allows security units to interweave products and procedures to craft protection strategies that are custom tailored to their organization’s security needs.
Database Activity Monitoring
The ultimate target for many breach attempts is the database. It is widely known to be the container for the organization’s most sensitive data assets. In the past, database administrators have combined product vendor supplied security mechanisms with traditional security products to protect their database environments. Although the major database vendors continue to integrate strong security features into their products’ architectures, third-party offerings in this space have been somewhat limited.
A new breed of products loosely grouped together and defined as “Database Activity Monitoring” (DAM) has recently begun to gain traction with the database administration community. Database Activity Monitoring allows organizations to gain visibility into all database activity including local privileged access and sophisticated attacks from within the database itself. Monitoring helps administrators protect their most valuable and sensitive data from external threats and malicious insiders by alerting them to attacks as well as terminating sessions that violate predefined security policies.
Database Activity Monitoring focuses on the evaluation of the SQL statements accessing the data from a security perspective. Much like a parsing engine that checks the syntax of a SQL statement to ensure it is syntactically correct, the software analyzes the statement and compares it to a predefined set of security rules.
Depending on the vendor utilized to provide Database Activity Monitoring, the combination of rules can be as complex or as simple as needed to protect the database. Most offerings allow administrators to combine security rules that consist of:
- Database account executing statement
- Application program sending statement to database for execution
- Host name, IP address of computer accessing the database
- Time of day
- Type of access being performed (SELECT, UPDATE, INSERT, DELETE)
- Data Definition Language statements (CREATE, DROP, ALTER)
- Data Control Language (GRANT, REVOKE)
The software then allows the administrator to take preventative action for statements that violate the predefined security policies. Depending on the product used, preventative action could include logging the statement, generating alerts or blocking the statement from execution. Some vendor offerings also allow the administrators to quarantine the user, which prevents them from issuing any additional statements that access the database.
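How these rules combine can be pictured with a small sketch. The following T-SQL is purely illustrative – the table, columns and values are all invented for this example, and real Database Activity Monitoring products evaluate statements in an engine outside the database – but it shows the matching logic: a rule fires when every non-NULL attribute matches the statement being evaluated, and its Action column dictates the response.

```sql
-- Hypothetical attributes of a captured SQL statement
DECLARE @LoginName sysname = N'app_user',
        @ProgramName nvarchar(128) = N'BillingApp',
        @HostName nvarchar(128) = N'web01',
        @StatementType varchar(10) = 'DELETE';

-- A rule matches when each of its non-NULL attributes equals the statement's
-- attribute, or the statement runs outside the rule's allowed time window.
-- Action would be one of: LOG / ALERT / BLOCK / QUARANTINE.
SELECT PolicyId, Action
FROM dbo.SecurityPolicy            -- invented policy table
WHERE (LoginName     IS NULL OR LoginName     = @LoginName)
  AND (ProgramName   IS NULL OR ProgramName   = @ProgramName)
  AND (HostName      IS NULL OR HostName      = @HostName)
  AND (StatementType IS NULL OR StatementType = @StatementType)
  AND (AllowedFrom   IS NULL OR CAST(GETDATE() AS time)
                                NOT BETWEEN AllowedFrom AND AllowedTo);
```

A NULL attribute acts as a wildcard, which is what lets administrators keep the rule set as simple or as granular as their protection needs require.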
The major offerings also provide out-of-the-box compliance and vulnerability reports. The Database Activity Monitoring software has preprogrammed rules for most major regulatory requirements including SOX, HIPAA and PCI. Vulnerability scans allow administrators to quickly identify potential issues and take the steps necessary to remediate or mitigate their impact.
The general vulnerability scans test for weak passwords, default accounts and other breaches of best practices that may result in a vulnerable database. Scans can be scheduled to run on a regular basis to ensure that no new vulnerabilities have been introduced.
Remote DBA Experts’ (RDX) Database Activity Monitoring Service
As a remote database services provider, RDX understands the important role security plays in our customers’ environments. Our customers have entrusted our organization to help safeguard their critical data stores.
RDX completed an extensive evaluation of database protection products and has chosen to partner with McAfee Security to provide a database security monitoring offering to our customers. A thorough evaluation of a Database Activity Monitoring vendor is critical to any shop desiring to implement the technology. RDX performed a traditional vendor analysis that assigned weights to evaluation criteria based on their importance to our organization. Because RDX supports a wide range of database products, the product was required to support as many different vendor offerings as possible. Some of the additional criteria RDX used in its evaluation follow:
- Track record as a proven security software provider and presence in market place
- Wide range of SQL statement security monitoring options (type of access, user, time-of-day, etc.)
- Minimal performance impact on the database, transactions and individual SQL statements being analyzed
- Ability to quarantine users, not just block their statement from executing
- Strong vulnerability scanner option that included predefined rules for SOX, HIPAA and PCI
- Robust reporting and alerting features
- Internal security mechanisms that prohibit product tampering and unauthorized access to the Database Activity Monitoring data store
- The ability to virtually patch systems to quickly protect databases from new exploits until vendor-supplied hard patches become available
- Ease-of-installation and debugging
After the tool is purchased, an implementation project is initiated to tailor the product to the organization’s Database Activity Monitoring needs. The implementation can be decomposed into the following tasks: educating application owners and security teams on the features the product provides, needs analysis, product customization, definition of alert notification and escalation procedures, initial vulnerability scans and ongoing monitoring. RDX’s implementation consists of the following phases:
Customizing the Security Offering
RDX works with each customer to design a security service offering that meets their unique security monitoring needs:
- Detailed discussions are held during integration on the type of data to be protected as well as the customer’s current internal security practices and product sets. The intent is to allow RDX to fully understand each customer’s overall security requirements and current controls.
- RDX’s security services are discussed in-depth with each customer to ensure they fully understand RDX’s data protection strategies.
- RDX works with customers to agree upon the installation and configuration of the security monitoring architecture.
- RDX identifies critical customer software products being utilized to create weekly vulnerability and new threat notifications.
- Criteria used to generate alert event notifications are agreed upon.
- Alert event notification and escalation procedures are finalized.
- Initial and ongoing vulnerability scans are scheduled.
The initial vulnerability scan provides a detailed security analysis of the database being protected. Its intent is to identify current vulnerabilities to allow RDX and the customer to quickly address them.
- The scanning process automatically finds databases on the network, determines if the latest patches have been applied, and tests for weak passwords, default accounts, and other common threats — making it easier to demonstrate compliance to auditors and improve protection of critical data assets. The scanner conducts a check of more than 3,500 potential database vulnerabilities.
- Scanning templates for PCI DSS, SOX, HIPAA and other regulations can be utilized, as well as specific reports for various stakeholders such as DBAs, developers and security teams.
- The scanning output provides remediation recommendations for most high-priority vulnerabilities.
- RDX’s database and operating system experts present the scanning results to customers and provide valuable insight into each security flaw and the type of exposure or vulnerability it presents. RDX works with each customer to mitigate or completely resolve the issue.
- RDX will then schedule the scanner to run on a quarterly basis to ensure that no new vulnerabilities have been introduced.
24×7 database activity monitoring is the heart of any Database Activity Monitoring implementation:
- A small client-side monitoring sensor is installed on each monitored target.
- The sensors send alert notifications to a central console installed at the customer site.
- RDX works with each customer to create security event identification criteria to alert on or terminate sessions based on, but not limited to:
  - Application program
  - IP address or host name
  - Time of day
  - Type of statement (SELECT, INSERT, DELETE, TRUNCATE, UPDATE, etc.)
- The alert notifications are forwarded to RDX’s Proactive Monitoring and Response Center (PMRC) for processing.
- RDX’s PMRC Team customizes the alert event notification and escalation procedures to meet the customer’s security needs.
- RDX will configure self-defense mechanisms to detect product tampering and trigger tampering alerts.
- The PMRC team constantly reviews alerts generated during initial implementation to reduce alerting “noise.” Discussions are held with the customer to reduce unwarranted notifications.
- RDX ensures that all monitoring components are functioning as expected and performs system upgrades to ensure that all new features are quickly leveraged by each customer.
The intent is to provide customers with an ongoing service offering that focuses on the protection of sensitive database data stores:
- Security Analysis Services
  - Configure custom Database Activity Monitoring reports
  - Work with the customer to add or change database-specific alert event notification criteria
  - New threat analysis – identify customer IT product sets during integration and create new vulnerability notifications that may affect those products
  - DBAs and OS engineers provide detailed insight into alert notifications when requested
  - Scan for new databases added to the network
  - Perform quarterly vulnerability scans to ensure that recent changes have not increased exposure to security threats
- Database and OS Patching Services
  - Upgrade databases and operating systems to new releases
  - Apply database and operating system security patches
  - Provide advice to customer personnel on patching and upgrades
Protecting against unauthorized data access must be an ongoing process. RDX is acutely aware that those individuals desiring to exploit data stored in computer systems for personal means are constantly changing and improving their data access strategies. The fluid nature of external and internal attacks requires DBA units to leverage time-tested database security best practices but also actively seek out and implement hardware and software products that can be interwoven into an effective database protection strategy. One of those key tools in your DBA toolbox is the database activity monitor. If you have sensitive data, you can choose to implement the product on your own or take advantage of our services.
Regulations dictated by the Affordable Care Act and the Centers for Medicare and Medicaid Services' (CMS) meaningful use rules will likely require health care organizations to seek database support services. Stage 2 of the latter requires hospitals, clinics, insurance companies and other entities in the medical industry to adopt electronic health records, so that it's easier for professionals to share patient information – the hope being better, cheaper care for individuals.
Abiding by mandates
This massive transition toward EHRs has prompted health care companies to take two factors into consideration. First, in order to remain compliant with the Health Insurance Portability and Accountability Act (HIPAA), robust security measures must be applied to data storage, servers and facility networks. Second, professionals must find a way to manage a system in which a doctor can access, create and edit information from multiple devices. For these reasons, many have regarded IT outsourcing as a viable option, turning to database administration professionals to oversee system management.
Handling such operations in-house often comes across as a challenge for companies that have focused more on patient care than developing computing capabilities, but it's what the federal government mandates. According to iHealthBeat, the CMS recently published a notice in the Federal Register that it plans to assess the readiness of organizations looking to adhere to meaningful use laws. A database containing reports on each entity obligated to abide by the CMS' mandates will be launched after reviews are completed.
Diving into big data analytics
The CMS' laws provide the health care industry with a unique opportunity to capitalize on the benefits associated with information analysis. InformationWeek noted that metrics can help organizations scrutinize costs, reach diagnoses faster, adjust managerial practices, improve workflow and much more. To get the most out of such technology, data could be collected from EHRs, wearable devices such as FitBits and numerous other sources.
"We, as a society, need to start creating our own metrics for how health care quality is defined," Dr. Anil Jain, senior vice president and chief medical officer at Explorys, told InformationWeek in an interview.
Ten years ago, integrating analytics into hospital practices was largely impractical. However, the advent of cloud computing, outsourced DBA services and greater demand for the technology have driven down costs. It's time that the health care industry benefits from it.
The post Health care regulations may boost need for DBA services appeared first on Remote DBA Experts.
Virtual-Strategy.com reports that Boston Medical Center has recently upgraded its database environment and moved it onto Oracle Database Appliance to improve efficiency and performance. The academic medical center, affiliated with Boston University, aims to fully support, scale and optimize its clinical and operational data environment.
By embracing a more modern engineering solution, BMC will be able to save both time and money, reducing annual storage costs and management time with multiple vendors. Oracle Database Appliance has already provided the medical center with greater CPU and RAM usage and has allowed BMC to consolidate its data management team – meaning engineers can focus on other high-priority projects.
Solutions like Oracle Database Appliance can help hospitals modernize their databases to achieve faster, more insightful analytics on the quality of their treatments and care.
The post Boston Medical Center uses Oracle for higher performance [VIDEO] appeared first on Remote DBA Experts.
There has been plenty of talk since the White House Big Data Report came out last week about the implications of big data for the health care and education industries, personal security and other important privacy issues.
Among these, de-identification and perfect personalization stand out as things to watch, as technology companies become increasingly capable of linking device users with personal information. The question at hand is how people will be able to control their online identities. Even if organizations using privacy protection technology de-link “anonymous” data from a specific person or device, advancing technologies are making it easy for this information to be re-linked.
Perfect personalization, which allows marketers to deliver exactly the right message to customers, could also potentially leave room for discrimination based on ethnic names or cultural preferences.
The important thing is to be aware, on both a personal and consumer level, of the power and reach big data continues to amass.
The post White House Big Data Report Raises Privacy Questions [VIDEO] appeared first on Remote DBA Experts.
In response to the burgeoning cloud market, Oracle has rolled out a number of applications designed to help companies streamline migration. A number of businesses interested in capitalizing on the benefits associated with the cloud are turning to database administration services to assist them. Yet, there are a few preliminary steps these enterprises should take before submitting to cloud fever.
Get your licensing in order
SiliconANGLE recently attended May's EMC World 2014 event at the Sands Convention Center in Las Vegas, where EMC Principal Engineer Bart Sjerps discussed why and how Oracle customers are virtualizing their databases. Many are choosing to put these critical production servers in virtualized environments such as VMware, which allows users to resolve issues regarding support and certification. Deployment strategies aside, he claimed that companies should consider the legalities of the situation before proceeding.
"Oracle is very keen on licensing," he told EMC World attendees, as quoted by the source. "Make sure that your licenses are in place, that you're compliant with your Oracle policy … if you're not in compliance, then you might have some severe issues later."
To get a better understanding of what these licensing policies entail, it may be worthwhile for businesses to hire Oracle experts to provide them with specialized insight.
One of the key reasons corporations consider shifting to a cloud solution in the first place is the promise of greater operability. Robin Birtstone, a contributor to The Register, recommended establishing robust baselines for existing performance models and figuring out why operations are not where decision-makers would like them to be. It's important to determine what kind of processes a company would like to benefit from once the transition to the cloud is complete.
Analyze the applications
The next step is figuring out how all of the applications, query loads and central processing units will interact with the new environment. Considering the latter element, Sjerps stated that database administration professionals often find that traditional, physically deployed servers possess very low CPU utilization. Keeping these machines running and managing them in a tangible format can be quite expensive when costs are factored into the equation.
"One of the things you can do with virtualization is basically glue all those CPU resources together," said Sjerps.
Sjerps noted that such a technique will enable remote DBA personnel to move workloads and set CPU shares so that companies can drive utilization. If the aforementioned considerations are taken into account, a business will be ready for cloud migration.
Enterprises choosing to remain with legacy data centers won't be able to take advantage of the Internet of Things. Database administration professionals would agree that the large amount of information being generated by Web-connected devices, ranging from smartphones to automobiles, cannot be contained or properly analyzed in a traditional system.
According to CIO, experts at EMC have stated that cloud computing offers solace to those aggregating large troves of intelligence. The scalable storage volume offered by the technology allows IT departments to collect diverse data sets and run analysis programs to translate the ones and zeroes into actionable information.
EMC executives assembled at the company's world conference in Las Vegas earlier in May, claiming that the "2nd platform," which consists of client-server technology, will be superseded by a "3rd platform" of computing spearheaded by big data, cloud deployments, mobile devices and social networking. EMC has combined its storage business with VMware, RSA Security and its own Pivotal division to help businesses make the transition.
EMC President Jeremy Burton referred to this cohesion as a "federation," claiming that the face of IT is changing rapidly to adapt to new data collection techniques. What it foreshadows is an environment of outsourcing, in which businesses hire entities providing database support services and other forms of IT backing.
Can they defend the hub?
It seems that whenever cloud computing is mentioned in a conversation, ruminations regarding protection are expected to transpire. Though any IT department is bound to be apprehensive about transitioning to an environment that bypasses traditional platforms, it should be noted that how secure a cloud server is depends largely on the priorities of the company using it.
ITProPortal referenced a survey of 250 enterprises conducted by SkyHigh, which found that 3,571 different kinds of cloud services were being used among the study's participants. Of those products, a mere 16 percent possessed multi-factor authentication and 11 percent had data encryption features. Nearly 30 percent of respondents encountered hacking attempts from outsiders.
From what can be gathered from this study, it's not cloud computing that poses a threat, but the priorities set by the businesses leveraging it. Though database administrator services can provide enterprises with adequate protection, benefiting from their support depends on whether or not companies are willing to outsource to them. In addition, concern regarding cloud defense has spawned a new branch of IT that specializes in cloud security techniques – another option enterprises could capitalize on.
I support a system that uses third-party software. After a recent application upgrade, I began receiving sporadic 8623 errors. These began just once every few days and quickly escalated to 3-4 per day. The error I was receiving was 8623, Severity 16, State 1.
DESCRIPTION: The query processor ran out of internal resources and could not produce a query plan. This is a rare event and only expected for extremely complex queries or queries that reference a very large number of tables or partitions. Please simplify the query. If you believe you have received this message in error, contact Customer Support Services for more information.
I understood that this was a complex query but struggled with finding the source. I was using SQL 2008 R2 so the extended events available in SQL 2012 were not an option for debugging. (If you are on this version, I recommend this article by Jason Brimhall: http://jasonbrimhall.info/2014/01/02/day-9-queries-going-boom/)
For me, the only option was to try to capture this in a trace. The error was occurring multiple times per day but not at consistent times. I set up a server-side trace for SQL and waited for the error to occur.
I won’t go into a lot of detail on server-side traces as there are lots of examples of this on various websites.
I created my profiler trace and exported the definition. In SSMS, I made minor changes to the code before starting the trace. I added a filecount parameter (50) and a stoptime parameter (8 hours in the future). This would give me 50 rolling trace files to review and would keep the trace from consuming excessive space on my server.
I then changed the sp_trace_create to include these new parameters.
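The change amounts to passing the two extra arguments to sp_trace_create. A sketch of the call follows – the file path and sizes here are placeholders, and the sp_trace_setevent/sp_trace_setfilter calls from the exported definition are omitted:

```sql
DECLARE @TraceID int;
DECLARE @maxfilesize bigint = 50;                       -- MB per rollover file
DECLARE @stoptime datetime = DATEADD(hour, 8, GETDATE());

-- @options = 2 enables TRACE_FILE_ROLLOVER; the final argument is @filecount
EXEC sp_trace_create @TraceID OUTPUT, 2, N'E:\Traces\Error8623',
     @maxfilesize, @stoptime, 50;

-- ...event and filter setup from the exported definition goes here...

EXEC sp_trace_setstatus @TraceID, 1;                    -- start the trace
```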
I executed the code to start the trace and waited…and waited…and waited. Once the error occurred, I stopped the trace so I wouldn’t lose the data I had collected.
I first found my TraceID:
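The sys.traces catalog view lists every trace defined on the server, so a lookup along these lines returns the id (id 1 is normally the default trace):

```sql
SELECT id, path, status
FROM sys.traces;
```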
and stopped the trace:
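Stopping it is a single call; @status = 0 halts collection but leaves the trace definition and its .trc files in place:

```sql
-- 2 here is a hypothetical TraceID; substitute the id found in the step above
EXEC sp_trace_setstatus @traceid = 2, @status = 0;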
I then queried the trace files looking for the error:
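Querying the trace output with the documented fn_trace_gettable function works along these lines (the path is a placeholder, and DEFAULT tells the function to read the file's rollover successors as well):

```sql
SELECT StartTime, SPID, Error, TextData, HostName, LoginName, DatabaseName
FROM sys.fn_trace_gettable(N'E:\Traces\Error8623.trc', DEFAULT)
WHERE Error = 8623;
```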
This showed me the error but not much more information (hostname, loginname, database name were removed for security purposes). I looked at the surrounding rows, before and after, and was still not able to find the issue. Since I had the SPID though, I was able to search through all the trace files for that SPID:
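With the SPID, a filter over the same rollover set surfaces everything that session executed (the SPID value here is hypothetical):

```sql
SELECT StartTime, EventClass, Duration, TextData
FROM sys.fn_trace_gettable(N'E:\Traces\Error8623.trc', DEFAULT)
WHERE SPID = 61            -- hypothetical SPID taken from the error row
ORDER BY StartTime;
```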
I found the row shown above and pulled the TextData. This query ran for almost 2 minutes and ended just after the 8623 error. In this query, I was able to see the issue quickly in the WHERE clause.
At this point, I had found the source of the error and had code to send to the vendor.
For my final cleanup, I closed the trace and manually deleted the .trc files:
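Closing the trace is the same procedure with @status = 2, which removes the trace definition from the server; the .trc files themselves still have to be deleted from disk by hand:

```sql
EXEC sp_trace_setstatus @traceid = 2, @status = 2;
```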
SQL doesn’t provide an easy way to identify these rogue queries, but the data is there if we know how to extract it.
Thank you for reading. Feel free to leave a comment below, and stay tuned for my next post in the coming weeks.
The term ‘big data’, which describes the massive collection of data sets from traditional and non-traditional sources used for market discovery, analysis, trending and forecasting, seems to be the buzzword these days.
Consumers unknowingly contribute to 'big data' when they shop on their iPhones or buy a plane ticket online. Businesses track these transactions and interactions and use the data for analysis and mining to learn more about their audiences and to create customer-centric strategies involving spending and sales trends, interests, popularity and rewards programs, to name a few. Even the UN has urged the use of big data in pushing forward global development goals.
For businesses, big data should be maximized and used to its full potential. But this requires more than just great analytics – it requires strategy, security, and platform management.
A recent article on Forbes.com recommends that businesses create a corporate culture that understands, inside and out, the big data their company is obtaining; make sure security is embedded into all daily processes; and invest in a platform that will allow them to use their big data to make better business decisions and compete in their business space.
CIOs throughout the United States often outsource to remote database support services to provide their companies with mobile device management systems. However, getting employees to adhere to security protocols is a challenge in and of itself. This difficulty is causing executives to wonder whether or not they should reward their workers for participating or reprimand them for failing to do so.
Popular, but risky
The Oklahoman reported that even computers disconnected from the Internet are at risk of contracting a virus. Davis Merrey, owner and chief executive of TeamLogic IT, informed the source that the church he regularly attends in Oklahoma City had to rebuild its PC after someone uploaded corrupted files from a mobile device onto the machine.
Although the idea of accessing work-related documents from anywhere at any time is appealing to both business leaders and their subordinates, adhering to security policies and regulating access remains a challenge. As a result, database administration experts have recommended that enterprises distribute their own smartphones or tablets – which are typically regulated by DBA professionals – to employees. However, such a process can be quite expensive for businesses with a limited budget.
Some organizations have simply chosen to ban bring-your-own-device practices altogether. An anonymous mechanic working at Tinker Air Force Base claimed that the government's concern regarding cyber warfare has caused officers to prohibit airmen and employees from plugging electronic devices into the base's computers.
To tempt or punish?
Tom Kaneshige, a contributor to CIO, noted that companies still want to benefit from the cost savings and enhanced operability associated with BYOD. In order to do so, executives are realizing that they need to make those working beneath them care more about database security. Kaneshige referenced a survey of 500 employees conducted by Centrify, which showed that 43 percent have accessed sensitive corporate information while connected to an unsecured public network.
Due to this apparent apathy, some CIOs are outsourcing to database experts who can erase unauthorized applications and data held on smartphones or tablets or prevent them from obtaining enterprise data via certain connections. However, this process contradicts the reasons why companies allow their employees to participate in BYOD in the first place.
Josh Bouk, vice president of sales and marketing at Cass, told Kaneshige that his company offers its workers a monthly stipend in exchange for enrolling in Cass's BYOD policies. Employees undergo eligibility screening and then are granted permission to access the organization's databases.
A couple of weeks ago, my colleague, Brandi Dollar, wrote a blog post about SQL Server transaction log basics. Her post is a great lead-in to a script that I wrote to solve a common problem: high VLF counts. An important piece of managing your database transaction logs is keeping the number of virtual partitions within the log file, the virtual log files (VLFs for short), low. A high VLF count is typically the result of running with the default auto-grow settings. As the transaction log file continues to grow in sub-optimal increments, the fragmentation becomes worse and worse, and a high VLF count can contribute to several performance problems.

Fixing a database with a high VLF count is a simple process:
- Check the current size of the transaction log.
- Back up the transaction log.
- Shrink the transaction log to as close to 0 KB as possible.
- Check that the VLF count is less than 50 (best if less than 10 at this point).
a. If the VLF count is greater than 50, repeat steps 1 through 3.
b. You may also have to check that there are no active transactions, etc. The log_reuse_wait_desc column in sys.databases will help identify why the log file cannot be shrunk.
- Grow the transaction log back out to the original size.
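The log_reuse_wait_desc check mentioned in step 4b is a quick query against sys.databases; the database name here is a placeholder:

```sql
-- Shows what, if anything, is preventing log truncation for a given database
SELECT name, log_reuse_wait_desc
FROM sys.databases
WHERE name = N'MyDatabase';
```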
When I log into a new server environment for a client, it is pretty common to see multiple databases with high VLF counts. The steps above are easy enough to complete when you have one or two databases that need to be fixed, but it becomes a very tedious process when there are 20, 30 or 40 databases that all need attention.
To save a lot of the legwork involved, I wrote a script to generate all of the necessary commands. When you run the script below, it will print out all of the commands necessary to reduce the VLF count. Take the output from the messages tab, and you can run the result in phases in a new query window. I probably could have taken the script a step further to completely automate the process; however, I still like to have control over the process and monitor its completion. Also, rather than writing my own transaction log backups, I will typically run an existing maintenance plan job as needed.
The output of the script will look similar to this:
/****************************************
* Fix High VLF for database: MattWorks
* Starting VLF count: 219
****************************************/
-- Step 1: Get current log file size
use [MattWorks]
SELECT name AS [File Name], physical_name AS [Physical Name],
    size/128.0 AS [Total Size in MB],
    size/128.0 - CAST(FILEPROPERTY(name, 'SpaceUsed') AS int)/128.0 AS [Available Space In MB],
    type_desc
FROM sys.database_files;

-- Step 2: Take transaction log backup if needed, and then shrink the log file
DBCC SHRINKFILE (N'MattWorks_log', 0, TRUNCATEONLY);

-- Step 3: Check if current VLF count is less than 50.
-- If not, take another log backup and repeat step #2 until VLF count is less than 50. Check for active transactions, etc.
DBCC LogInfo;

-- Step 4: Grow log file back to original size
USE [master];
GO
ALTER DATABASE [MattWorks] MODIFY FILE (NAME = N'MattWorks_log', SIZE = 1024MB);
GO
I hope that this script can be of use to someone.
/**************************************
* Script to generate commands to reduce transaction logs
* with high VLF counts.
* written by Matt Nelson, RDX
***************************************/
SET NOCOUNT ON

--variables to hold each iteration
declare @query varchar(100)
declare @dbname sysname
declare @vlfs int

--table variable used to loop over databases
declare @databases table (dbname sysname)

insert into @databases --only choose online databases
select name from sys.databases where state = 0

--table variable to hold results
declare @vlfcounts table (dbname sysname, vlfcount int)

--table variable to capture DBCC loginfo output
declare @dbccloginfo table
(
    --resourceid int, --need to add this column for SQL 2012
    fileid tinyint,
    file_size bigint,
    start_offset bigint,
    fseqno int,
    [status] tinyint,
    parity tinyint,
    create_lsn numeric(25,0)
)

while exists(select top 1 dbname from @databases)
begin
    set @dbname = (select top 1 dbname from @databases)
    set @query = 'dbcc loginfo (' + '''' + @dbname + ''') WITH NO_INFOMSGS'

    insert into @dbccloginfo
    exec (@query)

    set @vlfs = @@rowcount

    insert @vlfcounts values(@dbname, @vlfs)

    delete from @databases where dbname = @dbname
end

/* Build data file info */
CREATE TABLE #DatabaseFiles
(
    [database_name] [sysname] NOT NULL,
    [file_id] [int] NOT NULL,
    [file_guid] [uniqueidentifier] NULL,
    [type] [tinyint] NOT NULL,
    [type_desc] [nvarchar](60) NULL,
    [data_space_id] [int] NOT NULL,
    [name] [sysname] NOT NULL,
    [physical_name] [nvarchar](260) NOT NULL,
    [state] [tinyint] NULL,
    [state_desc] [nvarchar](60) NULL,
    [size] [int] NOT NULL,
    [max_size] [int] NOT NULL,
    [growth] [int] NOT NULL,
    [is_media_read_only] [bit] NOT NULL,
    [is_read_only] [bit] NOT NULL,
    [is_sparse] [bit] NOT NULL,
    [is_percent_growth] [bit] NOT NULL,
    [is_name_reserved] [bit] NOT NULL,
    [create_lsn] [numeric](25, 0) NULL,
    [drop_lsn] [numeric](25, 0) NULL,
    [read_only_lsn] [numeric](25, 0) NULL,
    [read_write_lsn] [numeric](25, 0) NULL,
    [differential_base_lsn] [numeric](25, 0) NULL,
    [differential_base_guid] [uniqueidentifier] NULL,
    [differential_base_time] [datetime] NULL,
    [redo_start_lsn] [numeric](25, 0) NULL,
    [redo_start_fork_guid] [uniqueidentifier] NULL,
    [redo_target_lsn] [numeric](25, 0) NULL,
    [redo_target_fork_guid] [uniqueidentifier] NULL,
    [backup_lsn] [numeric](25, 0) NULL
)

EXEC dbo.sp_MSforeachdb 'INSERT INTO #DatabaseFiles SELECT ''[?]'' AS database_name, * FROM [?].sys.database_files'

print ''
print '/*********************************************************************************************'
print 'Copy results below this line to new query window to execute as needed.'
print '*********************************************************************************************/'

DECLARE @databasename varchar(max)
DECLARE @vlfcount varchar(max)
DECLARE @filename varchar(max)
DECLARE @physicalname varchar(max)
DECLARE @totalsize varchar(max)
DECLARE @availablespace varchar(max)
DECLARE @typedesc varchar(max)

DECLARE dbcrsr CURSOR FOR
    SELECT dbname, vlfcount
    FROM @vlfcounts
    WHERE vlfcount > 50 --comment this line to print statements for all databases.
        AND dbname not in ('tempdb','master','msdb','model','reportservertempdb')
    ORDER BY dbname

OPEN dbcrsr
FETCH NEXT FROM dbcrsr INTO @databasename, @vlfcount
WHILE @@fetch_status = 0
BEGIN
    DECLARE filecrsr CURSOR FOR
        SELECT name, size/128
        FROM #DatabaseFiles
        WHERE database_name = '[' + @databasename + ']' and type_desc = 'LOG'

    OPEN filecrsr
    FETCH NEXT FROM filecrsr INTO @physicalname, @totalsize

    --Generate print statements to execute in new window
    WHILE @@fetch_status = 0
    BEGIN
        if @totalsize = 0 SET @totalsize = 1

        print char(10) + '/****************************************'
        print '* Fix High VLF for database: ' + @databasename
        print '* Starting VLF count: ' + @vlfcount
        print '****************************************/'
        print '-- Step 1: Get current log file size'
        print 'use [' + @databasename + ']'
        print 'SELECT name AS [File Name] , physical_name AS [Physical Name], size/128.0 AS [Total Size in MB],size/128.0 - CAST(FILEPROPERTY(name, ''SpaceUsed'') AS int)/128.0 AS [Available Space In MB], type_desc '
        print 'FROM sys.database_files;'
        print char(10) + '-- Step 2: Take transaction log backup if needed, and then shrink the log file' + char(10) + 'DBCC SHRINKFILE (N''' + @physicalname + ''' , 0, TRUNCATEONLY);'
        print char(10) + '-- Step 3: Check if current VLF count is less than 50.'
        print '-- If not, take another log backup and repeat step #2 until VLF count is less than 50. Check for active transactions, etc.'
        print 'DBCC LogInfo;'
        print char(10) + '--Step 4: Grow log file back to original size'
        print 'USE [master];' + char(10) + 'GO'
        print 'ALTER DATABASE [' + @databasename + '] MODIFY FILE (NAME = N''' + @physicalname + ''', SIZE = ' + @totalsize + 'MB);' + char(10) + 'GO'
        print char(10) + char(10) + char(10)

        FETCH NEXT FROM filecrsr INTO @physicalname, @totalsize
    END
    CLOSE filecrsr
    DEALLOCATE filecrsr

    FETCH NEXT FROM dbcrsr INTO @databasename, @vlfcount
END
CLOSE dbcrsr
DEALLOCATE dbcrsr

DROP TABLE #DatabaseFiles
Thank you for reading! Stay tuned for my next post in the coming weeks.
Though it generally falls under the umbrella of the IT industry, cloud computing could be considered to be in a league of its own. Database experts are beginning to favor the scalable, flexible deployment option, as it allows them to efficiently accumulate and manage digital intelligence. IT companies are recognizing the popularity of this trend, and in turn helping their clients prepare for or execute cloud migration endeavors.
According to Chris Kanaracus, a contributor to Good Gear Guide, Oracle recently announced that it will allow its customers to transition their subscription plans for on-premise software programs to the company's Software-as-a-Service applications. The migration plan, appropriately named Customer 2 Cloud, enables consumers currently using Oracle's in-house human capital management and customer relationship management tools to switch over to cloud versions.
Many Oracle experts have stated that the business move is likely in response to increased pressure from competitors such as Workday and Salesforce.com. While the latter organization provides its customers with cloud-based CRM systems, the former constructs HCM and enterprise resource planning solutions that operate through SaaS models. Ray Wang, founder and chairman of Constellation Research, noted that many enterprises desire such cloud applications, but are often unsure of how to make the transition.
"Many customers need a defined methodology to go from on-premises to cloud," said Wang, as quoted by Kanaracus.
Oracle: a cloud game changer?
Oracle's Customer 2 Cloud program isn't the only solution the company has recently released that concerns cloud computing. In April, the corporation also released Solaris 11.2, a Unix-based operating system that provides businesses with a viable cloud management suite. ZDNet contributor Larry Dignan acknowledged Oracle CEO Mark Hurd's ambitious goal to spearhead new solutions and applications that will solidify the company's place as the leader of everything cloud. While such an endeavor may come off as a tall order, database administration service organizations have taken note of the latest Oracle projects.
For example, Dignan wrote that Oracle's Marketing Cloud can break down data channels and correlate them with IT and marketing features that can help businesses appeal to customers who exercise some kind of sway over other consumers. The program includes Oracle BlueKai, Oracle Content Marketing, Oracle Eloqua, Oracle Responsys and Oracle Social Cloud, ensuring that an integrated, concise solution will be at the disposal of CMOs throughout the world.
From what can be gathered of these recent initiatives, it's apparent that Oracle intends to assert itself as a serious competitor in the cloud application sector.
There are two IT aspects enterprises are looking for in regard to database administration: performance and data analytics. Aggregating large troves of digital information is becoming less of a challenge now than it was four years ago. Now, companies are more concerned with translating the data into actionable insight that can drive business transformation and agility.
Giving it a boost
One particular program that database support services have favored is Microsoft's SQL Server 2014, which has been lauded for its data analytics properties and processing capabilities. EnterpriseTech reported that storage and application performance enhancement developer Fusion-io is leveraging flash functions to accelerate the data warehouse management solution's core functions. Fusion-io President Lance Smith noted that SQL Server 2014 added two new ways in which Flash can be used to enhance database effectiveness.
- Use flash to accelerate the existing server engine using Buffer Pool Extension, a SQL Server 2014 feature that caches frequently accessed data pages on solid-state storage so they can be retrieved faster, without using many resources.
- Leverage flash to keep Hekaton (the name given to Microsoft's in-memory online transaction processing database) running better in tandem with the tables it stores and processes in main memory.
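As a rough sketch of the first option, Buffer Pool Extension is turned on with a single ALTER SERVER CONFIGURATION statement; the file path and size below are examples only:

```sql
-- Enable Buffer Pool Extension on a flash/SSD volume (path and size are examples)
ALTER SERVER CONFIGURATION
SET BUFFER POOL EXTENSION ON
    (FILENAME = N'F:\SSDCACHE\ExampleCache.BPE', SIZE = 50 GB);

-- Verify the current Buffer Pool Extension configuration
SELECT path, state_description, current_size_in_kb
FROM sys.dm_os_buffer_pool_extension_configuration;
```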
In a nutshell, Fusion-io's flash utilization reduces sequential disk input/output and increases transaction log throughput – making it easier to run complex analytics programs.
A platform for scrutiny
Nowadays, database experts realize that making heads or tails of the digital information at an enterprise's disposal involves executing effective information analysis programs. According to CIO Today, SQL Server 2014 is capable of supporting an environment that experiences heavy data scrutiny traffic.
Microsoft CEO Satya Nadella noted that the corporation sought to construct a platform possessing "ambient intelligence," or, an atmosphere that is both sensitive and responsive to an individual's presence. As can be imagined, creating such a system was no easy task.
"It requires technology to understand context, derive intent and separate signal from noise," said Nadella, as quoted by the source. "Building out a comprehensive platform that can enable this kind of ambient intelligence is a whole company initiative that we are uniquely qualified to undertake."
Ultimately, the platform, which has been described as "big data in a box" by some professionals, includes Internet of Things capabilities and delivers the building blocks SQL Server 2014 users require to connect, refine and analyze their data. It seeks to provide answers to those who have collected an incredible amount of digital information from various sources.
The United States economy has witnessed a general upswing since March. After the end of a harsh winter that slowed market progression, people are getting back to work and consumers are hitting the stores once again.
Increased fiscal activity means new expansion opportunities for enterprises. In the modern era, business growth is typically associated with enhancing big data capabilities. In a highly technical atmosphere, executives are considering the benefits associated with outsourcing their IT responsibilities to remote database support companies.
According to Forbes, the Bureau of Labor Statistics recently released its April employment report, which showed that 288,000 jobs were added during the month – 70,000 more than expert economists initially predicted. The unemployment rate dropped from 6.7 percent to 6.3 percent, the lowest rate since before the onset of the 2008 financial crisis.
However, statistical representation can often be misleading. Dan North, chief economist at Euler Hermes North America, told the source that the unemployment rate fell so much over a short period of time because more people left the labor force. Nevertheless, North noted that the feeling of optimism hasn't dampened corporate earnings and that the marketplace is growing faster than anticipated.
Where has all the labor gone?
While the Federal Reserve tries to cope with a quickly growing economy, database administration companies are looking for human assets to fill key positions. Nonprofit organization CompTIA recently surveyed 305 IT firms across the United States for its quarterly IT Industry Business Confidence Index. Despite the fact that these enterprises expressed the desire to expand, many of them are having difficulty finding qualified personnel to fill new positions.
The index showed that 48 percent of the job opportunities offered by respondents consist of general support or service duties. Application development positions stood at 39 percent and cloud experts rang in at 33 percent. In addition, 29 percent of the open positions were for network engineers and 28 percent presented opportunities for professionals with security expertise.
Despite the lack of labor that seems to be plaguing the industry, it takes CIOs considerably less time to fill these positions (23 business days) than it does in other sectors. However, many IT professionals surveying candidates are quite picky about their choices – about 80 percent of respondents found recruiting for technical hires "challenging."
What this creates is an incredibly stressful work environment for database experts who are already on staff. It's not uncommon for those working for a managed IT services company to be on call constantly, sometimes putting in 70 hours in a work week. CompTIA's evidence clearly points to a need for more education in computer sciences.
Public sector officials and experts alike have largely celebrated the potential innovation and efficiency-driving power of the massive transition into the digital era, as data management outperforms traditional information governance in a variety of ways. In the health care sector, many organizations are still working to establish a stronger foothold in modernized data management practices because of the novelty of most systems.
Health Data Management recently listed several reasons why many businesses in this sector have issues with maintaining the best practices of data encryption, specifically regarding mobile device utilization. Citing security breaches that have taken place and impacted millions of individuals throughout the past several years, the news provider pointed out that health care providers are largely overlooking the necessary actions that must be taken to protect patient data that is being accessed by smartphones and tablets.
Encryption, when managed and monitored properly, can significantly improve the level of control an organization has over its data, regardless of whether it is in a storage environment or in transit at any given time. According to the source, the most common reasons behind a widespread lack of device, application and data encryption in this sector continue to be related to in-house expertise shortages and a lack of leadership acumen.
Like many other sectors in the modern market, the health care industry has had to tackle countless trends that have emerged in a relatively short period of time, leading to extraneous complexity and a generally uphill battle.
However, support is available for all types of organizations, especially when it comes to data management and security procedures. The trick is to know when those assistive frameworks are necessary, and where to look during the provisioning process. By proactively identifying data management needs, be it for security, accessibility or disaster recovery-related purposes, medical firms will be able to remain ahead of the curve.
Oracle experts working for companies using singular or multiple cloud platforms are anticipating a new update released by the database solutions developer. Cloud computing has grown more popular with time, making it only appropriate that one of the world's most renowned IT enterprises would develop a product that makes working with cloud servers much easier.
According to InfoWorld, Oracle recently introduced a Solaris update that includes a share of the OpenStack cloud management solution. It is believed that the system will make it easier for Oracle customers to operate mixed cloud environments. Released as a public beta on April 29, Solaris 11.2 possesses OpenStack components such as the Horizon dashboard, Nova compute module, Neutron network platform and Cinder block storage structure.
For those who aren't aware of what OpenStack brings to the table, the news source explained that the solution can be used to implement on-demand compute services on virtualized equipment as well as private cloud deployments.
Moving toward an integrated solution
Cliff Saran, a contributor to Computer Weekly, stated that Solaris 11.2 running on Oracle's aggressively priced Sparc servers makes for a more cost-effective solution, though it's debatable whether the combination will surpass Linux-based x86 servers held in a data center environment. Database administration professionals have claimed that Oracle's recent move is likely a push for an integrated system.
"It is obvious that Oracle has delivered on its commitments regarding Sparc and is continuing its investments in Sparc CPU and system design as well as its Solaris operating system technology," said Richard Fichera, research director for Forrester, as quoted by the source.
Fichera further noted that, in order to remain competitive with Linux servers, the new Sparc deployments will feature a multi-year, reliable performance progression and system scalability.
Data infiltration attempts are becoming just as much a part of the criminal underworld's practices as money laundering, drug trafficking and extortion.
Though it may be strange to compare a computer nerd with a notorious mobster, many database experts would suggest that the damage a well-rounded cybercriminal can cause exceeds the capabilities of a ruthless gang confined to the streets of New York City. Once an individual harboring ill intentions gains access to a company's servers, they can potentially steal thousands of data points representing business finances and consumer information.
Understanding what they're up against
Paul Kleinschnitz, general manager of First Data's cybersecurity solutions department, told Bank Info Security that many organizations lack a full comprehension of what they're up against. Although stealing intelligence through a computer seems more benign than pointing a gun at a person's head to obtain it, that doesn't make cybercriminals any less relentless or malevolent. In addition, the reach of such figures is much more expansive than traditional underworlds.
"One of the inhibitors to adoption in this space, in my opinion, has been a lack of knowledge of how these technologies work together, rather than compete," Kleinschnitz told the source.
Essentially, a comprehensive understanding of which tactics cybercriminals employ will give companies a better perception of how to deter attacks. Some executives hire database administration services well versed in data infiltration techniques to provide oversight of the criminal element's practices.
Employing tactics of their own
After comprehension of the environment working against them has been established, corporations should begin implementing cybersecurity practices and techniques that will deter infiltration attempts by hackers. Jonathan Hassell, a contributor to CIO, claimed that such attacks are going to happen no matter what enterprises do, but outlined four of the best practices they can employ, regardless:
- As customers often fall victim to cyberattacks, it's important that they are notified whenever a breach occurs. This will allow them to take any steps necessary to protect their critical information.
- Ultimately, malevolent figures operating on an online platform look for intelligence pertaining to personal finances and identity-based data. Therefore, businesses should protect the servers on which such knowledge is held.
- Lobby for legislation that supplements meager IT budgets so that investment in security software can occur.
- When an attack occurs, ask for assistance from database support services and other IT professionals.
Though the down payments associated with protection software may seem intimidating, it's important for executives to recognize that a well-orchestrated data breach could cost them millions of dollars.