
Chris Foot

Remote DBA Experts Blog

SQL Server Transaction Log Basics

Thu, 2014-04-24 07:06

Here at RDX, our internal and external clients have varying levels of SQL Server knowledge. Explaining technical details to clients is, in my opinion, one of the most rewarding tasks we perform. Not only does it allow us to provide client-specific recommendations, it lets our relationships evolve from ‘support personnel’ to trusted team members. In my second blog post, I am addressing a topic that comes up often – SQL Server transaction logs.

Transaction Logs and Their Internal Structure

Each database within SQL Server has at least one transaction log file. The default file extension for these is “LDF”. The transaction log files are internally split into virtual log files (VLFs). VLFs are created when a transaction log file is created and whenever the transaction log file grows. With improper autogrowth settings or with large transaction log files, the number of VLFs can grow to the point that performance degrades. You can identify the number of VLFs in a database by running this command and noting the number of rows returned:

DBCC LogInfo
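
If you would rather capture that count than eyeball the output, the rows can be loaded into a temporary table and counted. This is a minimal sketch; the column list assumes SQL Server 2012 or later, where DBCC LOGINFO returns a leading RecoveryUnitId column (remove that column on older versions):

-- Capture DBCC LOGINFO output so the VLF count can be queried directly
CREATE TABLE #loginfo (
    RecoveryUnitId INT,          -- present in SQL Server 2012 and later
    FileId         INT,
    FileSize       BIGINT,
    StartOffset    BIGINT,
    FSeqNo         INT,
    Status         INT,
    Parity         TINYINT,
    CreateLSN      NUMERIC(38, 0)
);

INSERT INTO #loginfo
EXEC ('DBCC LOGINFO');           -- run in the context of the database being checked

SELECT COUNT(*) AS VLFCount FROM #loginfo;

DROP TABLE #loginfo;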

Recovery Models and Their Effects

Some data is transient or can easily be recreated through your application. For these databases, the Simple recovery model can be used. Full and differential backups are the available backup types, and log records are cleared at each checkpoint once the transactions that wrote them have completed.

In other situations, you need to minimize data loss, also referred to as ‘point-in-time’ recovery. Using the Full recovery model in conjunction with full, transaction log, and, at times, differential backups allows you to meet this need. Records in the transaction log remain in the file until a transaction log backup is taken; without transaction log backups, databases using the Full recovery model will experience unchecked transaction log file growth.

During large bulk operations, the Bulk-logged recovery model can be used. This minimizes the amount of logging taking place in the transaction log and, therefore, your disk space requirements. Point-in-time recovery is not supported under this recovery model, so if you require it, take a transaction log backup immediately before switching the database to Bulk-logged recovery and again after switching it back to Full recovery.
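
Here is a minimal sketch of that pattern; the database name and backup paths are placeholders:

-- Log backup immediately before the switch preserves point-in-time recovery up to here
BACKUP LOG SalesDB TO DISK = N'D:\Backups\SalesDB_log_before.trn';
ALTER DATABASE SalesDB SET RECOVERY BULK_LOGGED;

-- ... run the bulk operation here ...

ALTER DATABASE SalesDB SET RECOVERY FULL;
-- A log backup after switching back restarts the point-in-time recovery chain
BACKUP LOG SalesDB TO DISK = N'D:\Backups\SalesDB_log_after.trn';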

How Transactions are Tracked

Within the VLFs are records of all transactions that occur within the database, marked with log sequence numbers (LSNs). These LSNs are important for recovering your data to a point in time if a disaster or failure occurs. Every backup begins with a header containing the LSN details, and this information can be used to ensure you have all of the backups you need to recover. Restoring backups out of order, skipping a restore of a transaction log backup, and restoring only transaction log backups are not supported. If a recent transaction log backup is deleted, it will affect your ability to recover, and a new full backup should be taken as soon as possible. If you suspect a transaction log backup has been deleted, you can compare the header details below (a sketch of reading them with RESTORE HEADERONLY follows the list).

  • FirstLSN – the LSN at the beginning of the backup

  • LastLSN – the LSN at the end of the backup. With transaction log backups, LastLSN can be compared to the FirstLSN in the suspected next transaction log backup to confirm continuity.

  • DatabaseBackupLSN – the FirstLSN of the last full database backup

  • DifferentialBaseLSN – the LSN of the full database backup that serves as the base for differential backups
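
To view these fields for a given backup file, read its header; the backup path below is a placeholder:

-- Returns one row per backup in the file, including FirstLSN, LastLSN,
-- DatabaseBackupLSN and DifferentialBaseLSN
RESTORE HEADERONLY FROM DISK = N'D:\Backups\SalesDB_log_before.trn';

Comparing the LastLSN of one transaction log backup to the FirstLSN of the next confirms whether the chain is intact.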

    Transaction logs are a critical concept to master when working with SQL Server. Without proper attention, they can impact the performance of your databases or your ability to recover from a disaster. Discussions between DBAs (‘accidental’ or otherwise) and application or business decision makers need to occur periodically to ensure that any data loss stays within the SLA.

    I hope this post has provided you with useful insight into your SQL Server transaction logs. In the coming weeks, Matt Nelson will be explaining how to manage the number of VLFs in your databases. Stay tuned!

    Windows XP switching to Linux?

    Wed, 2014-04-23 11:17

    To the chagrin of Microsoft executives, a number of business professionals still using the now unsupported Windows XP operating system are considering switching to Linux, which offers free software spawned by the open source development process. Though Linux is not necessarily the most popular desktop OS, database experts are now praising it as both usable and effective.

    Just make the upgrade
    InfoWorld contributor and avid Microsoft enthusiast Peter Bruzzese wrote that, although he understands why XP users are reluctant to move to Windows 8, there's little reason for them to avoid transitioning to Windows 7. For one thing, the user interfaces of Windows 7 and XP are virtually the same – to an XP user, Windows 7 looks like a simple, incremental update. Essentially, there's already a level of familiarity with Windows 7, and companies don't have to worry about employees getting acclimated to an entirely new way of working.

    Although a number of database administration and IT professionals have lauded Linux for vastly improving its modern, signature OS, Bruzzese isn't convinced. For one thing, those used to working with Windows-compatible software will have to switch to applications that former XP users would be vastly unfamiliar with. This means that training in specific programs may be required – an investment a number of companies are unable to make.

    Taking a chance?
    Yet there are some professionals, such as San Francisco Gate contributor David Einstein, who believe there's untapped opportunity in Linux. Despite the fact that Linux runs many large enterprise and Web servers and is the basis for Android, it hasn't gained a lot of traction with those looking for a desktop OS, primarily due to two factors:

    1. Past models weren't as easy to use as Mac OS or Windows
    2. Not enough software was produced to run on it.

    However, Einstein noted that a particular version of Linux is customized to replace both Windows XP and 7. Under the brand name Zorin, the OS mimics the user interfaces of the latter two solutions, contrasting with Bruzzese's conclusion that XP users won't be able to adapt easily to any Linux OS. Prospective users can download Zorin onto their hard disks for free. Furthermore, the system offers user-friendly software such as a Microsoft-compatible office suite, a photo editing program and the Chrome browser.

    Due to the zero cost and familiar interface, there's a good chance enterprises will choose Zorin to replace their unprotected XP deployments.

    Oracle seeks to build Oregon health insurance exchange

    Wed, 2014-04-23 01:28

    Among other reforms, the United States Affordable Care Act requires each state to construct a health insurance exchange to broker and manage care packages, measure eligibility, automate enrollment and electronically transfer patient information between participating entities. Oregon officials have relied on Oracle's database experts to build such a system, but may be considering a more consultative approach.

    According to a report released by Oracle, Oregon's Department of Human Services perceived ACA mandates as an opportunity to upgrade its IT infrastructure and improve the organization's delivery of services. Before the health care legislation was signed, DHS was already in the process of upgrading programs that assisted impoverished families with dependent children, but was using an inadequate legacy system to determine citizen eligibility for Medicaid.

    Oracle's assistance
    In addition, federal legislation obligated the DHS to connect patients with coordinated care organizations – networks of physicians, dentists, mental health professionals and other treatment providers who have collaborated to support consumers who will receive benefits from insurance exchanges. It was evident that such an operation required the expertise of database administration professionals.

    As a result, the DHS used the Oracle Enterprise Architecture Framework, as well as the IT developer's Architecture Development Process. Both of these elements provided a structure for updating the state's legacy infrastructure and ensured that any future developments would be adequately supported. Bob Ladouceur, information systems applications manager at Oregon DHS, claimed that the enterprise architecture enabled professionals to coordinate multiple activities within a versatile initiative to stay focused. 

    Mitigating contention 
    The result of the project was the development of a health insurance exchange website appropriately named Cover Oregon, which, according to Oracle officials, still isn't fully operational. According to the Statesman Journal, the state has blamed the database support services company for not meeting contract expectations, even though Oracle stated that Cover Oregon would not be ready for launch until October 1. Despite warnings issued by Oracle President and Chief Financial Officer Safra Catz, the state launched the website's services several months before the scheduled release date.

    Although Oregonian officials claimed that Oracle continued to reassure DHS that the website was complete, Oracle responded by noting that the state rushed the process. This level of contention has led DHS to sever its partnership with the database architecture company, a move that many IT professionals perceive as unwise. Rather than resolving the issue, Oregon is searching for database administration services to help it proceed.

    Governments getting serious about database security

    Tue, 2014-04-22 01:38

    Complex networking, cloud computing and a list of other IT technologies have made data protection all the more difficult. The United States and Canada have implemented security compliance standards, but database experts are claiming that governments need to move past basic regulations and employ more intricate, thorough defense practices.

    According to InfoWorld, Canadian authorities recently arrested 19-year-old Stephen Solis-Reyes for allegedly using the Heartbleed bug to steal taxpayer information from the Canada Revenue Agency's website. The CRA stated earlier this week that the vulnerability was leveraged to steal the Social Insurance Numbers of about 900 people, which caused the agency to temporarily prohibit citizens from filing online tax returns. The suspect was charged with one count of unauthorized use of a computer and one count of mischief in relation to data.

    The fact that an attacker as young as Solis-Reyes allegedly infiltrated a government website shows just how skilled modern cybercriminals have become. For this reason, many public authorities have outsourced to remote database support providers capable of devoting an incredible amount of manpower and resources to deterring network intrusions. As it becomes more difficult to physically steal financial information, more deviants will turn to the Web and other IT-based avenues to obtain confidential intelligence.

    Moving past the basics 
    It's not easy to gain access to a government database, but cybercriminals are quite capable of adapting to an environment known for its versatility. CIO noted that U.S. federal CIOs face a difficult balance: making infiltration hard while still giving public employees the access to digital information they need. Simon Szykman, CIO at the Department of Commerce, stated that an answer to this conundrum is automated surveillance, enabling networked assets to search for and report potential security breaches.

    "We're now moving toward an area of much more automated and near real-time situational awareness where we have systems that themselves are able to verify that controls are being implemented," said Szykman, as quoted by CIO.

    The National Oceanic and Atmospheric Administration told the news source that its database administration staff collects digital information from more than 20,000 devices. Due to automated monitoring, all of that data becomes centrally aggregated and analyzed, meaning that 1 billion events are processed every day. NOAA CIO Joe Klimavicz claimed that the organization blocks almost half a million malevolent Web connections each week. 

    Automated processing allows in-house IT departments and researchers to focus on developing new security techniques while maintaining a continuous overview of all network activities. 

    Contemporary enterprise databases grow more complex

    Mon, 2014-04-21 05:14

    Between merchandisers obtaining data through e-commerce applications and industrial developers searching for ways to optimize critical infrastructure grid performance, database experts would agree that the complexity of the modern database has expanded. Professionals typically think of scalability when they refer to the changing environment, but it's more instructive to scrutinize the programming languages and analytics applications used by companies.

    Diversified communications 
    For IT professionals, using multiple languages to submit commands or evaluate digital information can be an arduous task. InfoWorld contributor Paul Venezia noted that computer technicians typically settle on a single language and lean heavily on the commands and practices they use most. Using a different one means they have to adjust to a new way of completing tasks.

    The routine of using the same language can make professionals lose sight of the logic behind it. For example, if the same command were viewed through a different language, the person scrutinizing it might realize that there was a faster, more optimal way of executing it. Experienced database administration personnel are often multilingual, capable of understanding the subtext within particular directions given to the server. Given the wide variety of digital intelligence in play, it has become necessary for IT staff to be comfortable with that variability.

    Simplifying analytics 
    According to InformationWeek, enterprise data warehouse company Teradata recently released a QueryGrid data-access layer capable of orchestrating multiple modes of analysis across several platforms, including those developed and issued by Oracle. Chris Twogood, vice president of product and services marketing for Teradata, noted that the program can automatically conduct analytics tasks without constant surveillance from human assets.

    "Users don't care if information is sitting inside of a data warehouse or Hadoop and enterprises don't want a lot of data movement or data duplication," said Twogood, as quoted by the news source. "The QueryGrid gives them a transparent way to optimize the power of different technologies within a logical data warehouse."

    As an example, Twogood cited a task database administration services would conduct for a retailer. Essentially, QueryGrid would enable merchandisers to find high-value customers on the Web, then feed their comments on Facebook or Twitter into Hadoop. The program would then collect negative sentiment regarding the company and correlate that data with its more favorable customers in order to deduce how those individuals could influence others against churning.

    Although the intricacy of today's digital information is expanding, so are the software programs used to organize and analyze it all. 

    Charting the effects of Snowden’s NSA revelations

    Mon, 2014-04-21 05:08

    There's no question that the information disclosed by Edward Snowden regarding the United States National Security Agency's surveillance techniques has shaken the technological world. As a result, many domestic and foreign businesses are instructing their database administration providers to reevaluate their active cloud deployments. 

    Remaining apprehensive
    The NSA Aftershocks study recently surveyed 1,000 C-suite professionals specializing in information and communications technology and cloud computing. It found that approximately 88 percent of CIOs are rethinking their cloud purchasing behavior, with 38 percent revising acquisition contracts previously established with vendors. Out of all the respondents, a mere 5 percent believe that it doesn't matter where enterprise data is stored.

    In addition, the report discovered that the majority of corporations are solidifying contracts with cloud providers located domestically. A desire for closer proximity and a better overview of government legislation is driving this trend. Although corporations are unwilling to sacrifice cloud operability, executives are placing more emphasis on protection. About 84 percent of those surveyed reported that they are consulting with database experts to train in-house personnel in both rudimentary and advanced cloud security.

    "The vast majority of respondents agreed that extra investment would go towards added security measures, training was also seen as a key priority," the study acknowledged. 

    A double-edged sword 
    Despite the fact that many enterprises are battening down the hatches, maneuverability cannot be abandoned. By allowing employees unrestricted access to data, corporations will be able to fluidly make key business decisions. However, as many workers choose to obtain company information via mobile devices, the surveillance responsibilities of database support services become ever more complicated. 

    Praveen Thakur, a contributor to online magazine The Nation, claimed that security professionals are executing in-depth, multi-layered approaches to data defense as opposed to employing conventional techniques that are largely ineffective in the face of complex communications technology. Instead of constructing bulwarks designed for the sole purpose of deterring threats, DBA services are developing protection methods that consistently prevent, detect and manage cyber threats.

    Due to the fact that many enterprise employees use disparate applications and software to interact with digital intelligence, Thakur recommended outsourcing to IT professionals who can administer comprehensive protective measures. Collecting a variety of different security solutions to resolve separate issues can clutter an operation and actually do more harm than good by congesting system tools and features.

    Business professionals consider moving to Office 365

    Thu, 2014-04-17 02:00

    Many executives favor Microsoft products over competing software. Since its inception, the corporation has established itself as a developer of business-standard technology, with millions of subscribers distributed throughout the world. Due to recent improvements spearheaded by new CEO Satya Nadella, many organizations previously unfamiliar with the company's products are implementing Microsoft solutions with the help of database administration services.

    Releasing a more affordable product 
    Pete Pachal, a contributor to technology blog Mashable, noted that Microsoft began selling Office 365 Personal earlier this week for $6.99 a month, accommodating subscribers with applications such as Word, Excel, PowerPoint and Outlook, among others. In contrast to the solution's counterpart, Office 365 Home, Personal only allows users to install the program on a single PC or Mac. However, the offer makes sense for enterprises working primarily with such machines. 

    Personal's integration with Microsoft's cloud solution, OneDrive, enables employees to share, store and edit files seamlessly. As this process expedites business operations, senior-level management may consider Office 365 to be a viable option for satisfying the needs of their departments. For those looking to abandon products manufactured by Microsoft's competitors, however, the transition may be easier said than done.

    Steps for migration 
    Moving a large variety of email into Office 365 may require the assistance of database administration professionals. According to InfoWorld contributor Peter Bruzzese, corporations need to consider what information should be transitioned into Outlook, where that data is stored and whether or not it will be manipulated after all digital intelligence is successfully relocated. In order to ensure a smooth transfer, Bruzzese recommended making the following considerations:

    • Perform a preparatory review of all messaging needs and orchestrate a plan that will support those requirements.
    • If a company is migrating from Exchange, database support services can help it transfer all on-premises data into the cloud through Exchange Web Services, which allows users to export 400GB a day.
    • Those relocating data from Google, Network File Systems or Notes should consider using Archive360, which can filter data through Exchange and then transfer it into Office 365.
    • Companies transitioning email data from GroupWise could consider funneling the information through Mimecast and connecting the storage with Office 365 mailboxes.

    Obviously, a command of certain programs is required, depending on what kind of route an organization chooses. For this reason, consulting database experts may be the best option. 

    Retailers use business intelligence to optimize supply chain

    Wed, 2014-04-16 10:07

    Due to the prevalence of omnichannel retail, merchandisers are obligated to satisfy the inventory fulfillment requirements of brick-and-mortar stores and consumers. Instead of using human resources to scrutinize the distribution process, commodity-based companies are hiring database experts to implement business intelligence tools capable of providing actionable information regarding the supply chain.

    What's redefining modern delivery systems?
    E-commerce has allowed corporations to deliver their products to consumers residing in various parts of the country, creating variable demand for particular items. In order to anticipate customer desires, data analytics tools are being used to chart regional weather conditions, translate online surveys and monitor the distribution of materials. Jim Rice, a contributor to Supply Chain 24/7, stated that while transportation and storage processes cannot change the specifications of an item, they can revolutionize the way in which that particular product is delivered to a customer.

    For example, a customized, direct-to-order method can transform consumer expectations. People don't want to wait to receive their purchases – an unspoken covenant is established the minute they finalize an order on a website. Therefore, database administration personnel employ programs that scrutinize which areas of the supply chain can be optimized to ensure that products are delivered as promptly as possible. The patterns these software solutions recognize are often overlooked by human eyes.

    Enhancing global sourcing
    Database engineering company Oracle recently announced the introduction of Oracle Global Trade Intelligence, a global commerce analytics application that provides organizations with the ability to leverage worldwide sourcing and distribution data to measure, predict and optimize the performance of their supply chains. Released in February, the program contains modifiable dashboards that enable enterprises to construct user-defined trade performance measurements that scrutinize import and export activities throughout the world.

    Oracle experts and sourcing professionals are thrilled with the release, which also offers executives the chance to streamline communications between overseas departments. This process is expected to ensure that all materials are properly tracked, significantly reducing the chance of losing vital products. In addition, the program gives strategists the ability to anticipate the actions of both foreign and domestic competitors.

    "Organizations are moving beyond automation of their global trade processes and are seeking ways to leverage their global trade data to make better business decisions," said Vice President of Value Chain Execution Product Strategy Derek Gittoes.

    In the age of global commerce, it's imperative that companies possess programs akin to Oracle Global Trade Intelligence in order to expedite the shipment of goods and reduce the cost of such products on the consumer's end.

    Database experts try to mitigate the effects of the Heartbleed bug

    Tue, 2014-04-15 01:44

    Recently, the Heartbleed bug has sent a shockwave through the global economy. The personal information of online shoppers, social media users and business professionals is at risk, and database administration providers are doing all they can either to prevent damage from occurring or to mitigate the detrimental effects of what has already occurred.

    What it does and the dangers involved
    According to Heartbleed.com, the vulnerability poses a serious threat to confidential information, as it compromises the protection that Secure Sockets Layer/Transport Layer Security (SSL/TLS) technology provides for Internet-based communications. The bug allows anyone on the Web – particularly cybercriminals – to read the memory of systems protected by affected versions of the OpenSSL software, allowing attackers to monitor a wide array of transactions between individuals, governments and enterprises, among numerous other connections.

    Jeremy Kirk, a contributor to PCWorld, noted that researchers at CloudFlare, a San Francisco-based security company, found that hackers could steal a server's SSL/TLS private keys and use them to pose as legitimate webpages, decrypting traffic passing between a computer and a server. For online retailers lacking adequate database support services, it could mean the divulgence of consumer credit card numbers. If customers no longer feel safe purchasing products online, it could potentially result in the bankruptcy of a merchandiser.

    Think mobile devices are safe? Think again 
    Now more than ever, database experts are making concentrated efforts to effectively monitor communications between mobile devices and business information. As the Heartbleed Bug can compromise connections between PCs and websites, the same risk is involved for those with mobile applications bridging the distance between smartphones and Facebook pages. CNN reported that technology industry leaders Cisco and Juniper claimed that someone can potentially hack into a person's phone and log the details of his or her conversations. Sam Bowling, a senior infrastructure engineer at web hosting service Singlehop, outlined several devices that could be compromised:

    • Cisco revealed that select versions of the company's WebEx service are vulnerable, posing a threat to corporate leaders in a video conference. 
    • If work phones aren't operating behind a defensive firewall, a malicious entity could use Heartbleed to access the devices' memory logs. 
    • Smartphone users accessing business files from iPhones and Android devices may be exposed, as hackers can view whatever information a person obtained through select applications. 

    Upgraded deployments of OpenSSL are patching vulnerable avenues, but remote database services are still exercising assiduous surveillance in order to ensure that client information remains confidential.

    Mobile device management is a two-sided battle

    Mon, 2014-04-14 01:39

    The rise of the Internet of Things and the bring-your-own-device phenomenon have shaped the way database administration specialists conduct mobile device management. Many of these professionals are employed by retailers using customer relationship management applications that collect and analyze data from smartphones, tablets and numerous other devices. This level of activity creates a web of connectivity that's difficult to manage and often necessitates expert surveillance. 

    Managing the remote workplace 
    Merchandisers are challenged with the task of effectively securing all mobile assets used by their employees. Many of these workers have access to sensitive corporate information, whether it be product development files, customer loyalty account numbers or consumer payment data. According to CIO, some organizations lack the in-house IT resources to effectively manage the avenues through which intelligence flows from smartphones to servers.

    As a result, small and midsize businesses often outsource to remote database support services to gain a comprehensive overview of their BYOD operations. David Lingenfelter, an information security officer at Fiberlink, told the news source that the problem many SMBs face is that their employees are using their own mobile devices to access company information. Large enterprises often provide their workers with such machines, so there's inherent surveillance over the connections they're making.

    Moving to the home front 
    Small, medium and large retailers alike are continuing to use CRM, which provides these commodity-based businesses with specific information regarding individuals. The IoT has expanded the capabilities of these programs, delivering data from a wide variety of smart mechanisms such as cars, watches and even refrigerators. Information being funneled into company servers comes from remote devices, creating a unique kind of mobile device management for database administration services to employ.

    Frank Gillett, a contributor to InformationWeek, noted that many consumers are connecting numerous devices to a single home-based network, providing merchandisers with a view of how a family or group of apartment mates interacts with the Web. In addition, routers and gateways are becoming the default hubs that make network-connected homes ubiquitous.

    "These devices bring the Internet to every room of the house, allowing smart gadgets with communications to replace their dumb processors," noted Gillett.

    However, it's not as if the incoming information submitted by these networks can be thrown into a massive jumble. In order to provide security and organize the intelligence appropriately, remote DBA providers monitor the connections and organize the results into identifiable, actionable data. 

    Database experts contend with clients using Windows XP

    Wed, 2014-04-09 14:23

    Despite the fact that fair warning was given to Windows XP users several months before Microsoft terminated support services for the outdated operating system, a large number of businesses continue to use it. Citing security concerns, database administration services have urged these professionals to make the transition to Windows 8.1.

    Why it's a concern
    The last four patches were delivered to XP users on April 7. Michael Endler, a contributor to InformationWeek, stated that the 12-year-old OS still has twice as many users as Windows 8 and 8.1 combined. It's believed that general reluctance to switch to the new systems is rooted in how comfortable XP users have become with the solution. The problem is, IT professionals are expecting hackers to launch full-scale assaults on the machines hosting these programs in an attempt to harvest information belonging to individuals, as well as the companies they're working for.

    To the dismay of consumers, a fair number of banks and other organizations handling a large flow of sensitive customer data are still using XP. However, many of these institutions have enlisted database support services to provide protection and surveillance for their IT infrastructures. Endler noted that select XP subscribers will still receive backing from Microsoft, though they'll be shelling out millions of dollars for the company to do so.

    Making a push for the new OS
    In an effort to convince others to switch to the new Windows 8.1 update, Microsoft took a couple of strategic initiatives. First, the corporation offered XP users $100 toward the cost of obtaining up-to-date machines. In addition, CIO reported that Windows 8.1 users won't receive patches or other future updates for the OS unless they install the new update. In other words, if businesses don't favor the changes the company has been making to 8.1, there's no way they can receive security fixes, leaving many to rely on database administration to mitigate the issue.

    In contrast, Windows 7 and 8 users will continue to receive the same assortment of patches they've been getting. Though Microsoft has garnered generally positive attention for its integration of cloud and mobile applications into its brand, the company's business techniques have been met with criticism. It's likely that the software giant is simply employing these strategies to assert itself as a forward-acting corporation.

    College finds success with Oracle’s business intelligence tools

    Tue, 2014-04-08 14:02

    Professionals who believe that business intelligence is simply another buzz phrase thrown around by database experts are often left at the bottom of the totem pole of corporate production. These naysayers often perceive analytics tools to be an extraneous expense, but the technology is in fact becoming a necessity for corporations intent on surviving in an increasingly competitive market. 

    Reducing the amount of transfers
    According to Campus Technology, Valdosta State University in Georgia recently improved its overall student retention rate thanks to business intelligence applications provided by database heavyweight Oracle. Before implementing the solutions, the institution retained a mere 67 percent of its first-year student body, costing the university an estimated $6.5 million in annual revenue.

    With the assistance of a database administration service, the organization began integrating the analytics tools in April 2012 in a two-part transition procedure:

    • In the first phase, VSU implemented Oracle Business Intelligence Enterprise Edition, a program possessing interactive dashboards, ad hoc queries and strategy management. 
    • The second stage occurred in the fall of 2012, in which the university launched Oracle Endeca Information Discovery. The software enabled administrators to collect and scrutinize student data from various sources.

    After correlating the information gathered from Enterprise Edition and Information Discovery, VSU database administration noticed that pupils who ate breakfast on campus had a 10 percent higher retention rate than the majority, while freshmen who worked on campus had a 30 percent greater chance of staying at the school. As a result, the institution promoted on-campus eateries and invested $200,000 in student employment. A year later, VSU kept 2.5 percent more students than it did in previous years.

    Interest is increasing 
    Interest in the technology has increased over the years, especially among companies in the retail industry. TechNavio reported that the global business intelligence market in the merchandising economy is anticipated to grow at a compound annual growth rate of 9.19 percent over the next four years. Due to the incredible volume of data retained by commodity-based businesses, remote database support providers are introducing more complex data processing tools to their systems. 

    "Basically, BI means getting the right information to the right people at the right time to support better decision-making and gain competitive advantage," TechNavio noted. 

    Universities across the U.S. strive to make their institutions more appealing than their rivals. This could mean lowering tuition rates, promoting certain curricula or renovating dorms. However, school administrators could get a better idea of where to allocate resources after consulting the algorithmic conclusions of BI. 

    How the Internet of Things will transform DBA services

    Tue, 2014-04-08 01:48

    Contemporary data warehouses are going beyond the basic store-and-save capabilities IT departments are used to seeing. Due to the increased usage of data collection and analysis tools, database administration services now manage more complex infrastructures that are better able to host these programs. An increase in server usage and activity diversity has created an intricate environment demanding more assiduous maintenance and surveillance than was previously necessary.

    The next big thing? 
    Teradata, a company providing information analysis architectures, recently introduced Teradata Database 15, which promises to deliver a logical, more intelligent data repository. Scott Gnau, president of Teradata Labs, noted that the company is offering users the chance to orchestrate analytic queries across multiple systems, incorporate and consume JavaScript Object Notation and run non-SQL languages. 

    The organization's push for JSON is rooted in the prevalence of the Internet of Things. Companies are now using smart devices to amass millions of data points derived from the unique perspectives of each mechanism. The database experts claimed that the open standard format can offer organizations the agility needed to remain competitive. Business models, marketing campaigns and project developments can be quickly assembled from the human-readable text of JSON. 

    To the satisfaction of innovative database administration professionals, Teradata 15 is expected to possess greater application development features. Without having to attain new parallel programming skills, creators can now access a more diverse array of data and construct programs through a robust environment. 

    Organizing the disordered 
    One aspect of IoT data is that it is largely raw, unstructured and unorganized. As a result, IT-driven corporations are reassessing the value of NoSQL databases, which have been built to better handle the digital information produced by a wide array of smart devices, websites and other resources. According to Forbes, Oracle experts have claimed that this newfound interest presents a great opportunity to the world's second largest software company. The source noted IDC Research's prediction that 90 percent of all data produced this decade will be disorganized. 

    A traditional relational database management system is ill equipped to process the heterogeneous, non-alphanumeric data that has grown quite prevalent of late. Forbes cited the example of a blog whose carefully placed advertisements drive the proprietor's revenue: database analytics tools that skillfully select these just-in-time marketing campaigns are best supported by NoSQL, as it offers users horizontal scaling.

    RDBMS is slowly fading out of the mix, giving way to a new breed of operations better acclimated to the current disorganized data climate. In turn, database support services will answer with new assessment and management tools capable of handling NoSQL operations.

    Shifting preferences may change the priorities of database experts

    Fri, 2014-04-04 10:01

    Businesses are realizing that investment in data analytics tools can be a major boon to their market intelligence divisions. Digital information collected from smartphones, tablets and other devices is contributing to a seemingly limitless vat of knowledge for executives looking to launch the next big product or service. Due to the sheer scale and complexity of such an operation, corporations are outsourcing their IT responsibilities to database administration services capable of giving them a comprehensive view of all market and in-house insight.

    Connecting to more, in unlikely places
    Innovators have speculated that a smart refrigerator may emerge in the not-too-distant future, capable of providing owners with recipes obtained from the Internet based upon what food is being stored in the machine. It's this kind of intelligence that could potentially revolutionize the food industry. Grocery stores may build entire marketing campaigns based upon what edibles are most popular. Chain restaurants may use these devices to store their products and funnel the information into customized analytics tools designed to create new recipes.

    Behind these developments is the Internet of Things. Lori MacVittie, a contributor to InformationWeek, claimed that the IoT will continue to be integrated with wearables, children's toys, pens and other items, a process that's sure to revolutionize the contemporary data center. However, she noted that harnessing these assets won't be an easy endeavor, even for the most capable enterprises. Ultimately, two optional procedures will most likely become necessities. First, database experts will need to be consulted. Second, the appropriate applications will need to be used in order to process the influx of information.

    How the consumer will use it
    It's generally understood how corporations will harness this technology, but many are still speculating on how consumers will interact with devices connected to the IoT. According to Business Insider contributor Emily Adler, the instruments people use on a daily basis will enter the widespread data environment. She noted that homeowners may consult their home appliances to determine how much energy a household is using, the residual effect being that individuals can determine how to reduce their electric bills.

    As fitness is an ongoing trend that is likely to remain prevalent over the next couple of decades, chain gyms and exercise centers could ask their customers to connect their wearable devices to their corporate databases. Already, these mechanisms are capable of recording how many steps a person has taken as well as their weight, blood pressure and other statistics.

    This phenomenon will likely result in a trade-off between consumers and businesses, the latter consulting database administration support to harness programs capable of translating customer data into actionable intelligence.

    Corporations seek to find optimal database security

    Thu, 2014-04-03 01:46

    Though it may sound counterintuitive, a number of database experts have claimed that a company may benefit from disclosing information regarding its IT infrastructure to competitors. This may seem like a network security nightmare in and of itself, but collaborating with other market participants may provide valuable insight as to how organizations can deter cybercriminals. Others prefer to stick with improvements issued by established professionals. 

    Applying updates 
    Possessing quality database protection is being seen more as a profit-driver than an expense, primarily due to the fact that if digital information is stolen from a corporate server, it could potentially result in millions of dollars in losses. It's no surprise that database administration services are being consulted now more than ever. In addition, the makers of the products these professionals interact with have assessed security concerns and sought to mitigate potential problems. 

    Oracle NoSQL Database 3.0 was recently released, with improved performance, usability and safeguards. The upgrade utilizes cluster-wide, password-based user authentication and session-level SSL encryption techniques to deter cybercriminals from hacking into a company infrastructure. Andrew Mendelsohn, executive vice president of database server technologies for Oracle, claimed that it helps remote DBA personnel construct and deploy state-of-the-art applications in a secure environment.

    Walking around naked 
    Corporations often misunderstand the advice of IT professionals to share security protocols with their competitors. It's not about exposing weaknesses to cybercriminals and providing them with a comprehensive framework of the database's infrastructure; it's about collaborating with like-minded executives attempting to find a solution to an issue that isn't going to disappear.

    Evan Schuman, a contributor to Computerworld, cited Full Disclosure, an online community through which database administration support, C-suite personnel and IT professionals could publicly report network breaches and discuss methods through which security problems could be resolved.

    Due to the fact that gray hat hackers could access the forum, researchers would notify software companies at least 30 days prior to posting on the website so that the developers could apply the appropriate patches beforehand. This kind of initiative identified problems before cybercriminals could exploit them. Unfortunately, to the dismay of its participants, rumors have been circulating that Full Disclosure will shut down in the near future.

    "By not having this place to expose them, the vulnerabilities will remain hidden longer, they will remain unpatched longer, yet the attacks will keep coming," said an anonymous security manager for a retailer. 

    Ultimately, black hat hackers have extensive communities through which they can share the same kind of information that professionals posted to Full Disclosure. If the website goes dark, cybercriminals will still have networks of communication, while law-abiding IT industry participants will not.

    Organizations using data to track environmental change

    Wed, 2014-04-02 10:58

    In light of a study recently released by the Intergovernmental Panel on Climate Change, the database administration needs of public agencies and organizations are expected to expand significantly. As it was industrialization and innovation that incited this worldwide issue, the Internet of Things will continue to be used to identify the detrimental effects climate change has on particular ecosystems and economies of the world. 

    Patrick Thibodeau, a contributor to Computerworld, claimed that the IPCC's study acknowledged the importance of sensor networks to monitor the shifting global environment. Potentially, these devices could help government officials anticipate droughts, floods and natural disasters caused by rising temperatures. In addition, it is hoped that the mechanisms will identify ways to preserve water and food supplies as well as determine methods for reducing energy consumption. 

    If public authorities choose to acknowledge the recommendations of the IPCC, the influx of new data derived from the IoT is sure to increase network traffic, requiring the expertise of remote database support to ensure that all analytics programs are running efficiently. As it's somewhat ambiguous how these sensors will be deployed, the kinds of avenues through which information flows into networks may pose a challenge to in-house IT departments.

    An example of a new innovation 
    The types of devices the government and non-profit environmental agencies use are quite variable. Some may track the shifting tides across the Atlantic and the Pacific while others will determine the acidity of farming soil. If all the data collected by these devices is assembled onto a single server, outsourced database experts may be consulted to manage it all. It looks as if scientists have already taken the first step.

    According to Space Daily, engineers from Europe developed the Sentinel-5 instrument, a mechanism which allows the continent's Copernicus program to monitor air quality around the globe. The article noted that atmospheric pollution is linked to millions of deaths around the world. 

    "The readings will help to both monitor and differentiate between natural and human-produced emissions, providing new insight on the human impact on climate," noted the news source. 

    Amassing and translating such an incredible amount of data will most likely necessitate the expertise of remote DBA to ensure that networks don't crash or overload. It's hoped that Copernicus, the world's first operational environmental surveillance system, will provide scientists with specific insight on how the earth's population can reduce emissions. 

    The Merge Join Transformation and a SSIS Sorting Tip

    Wed, 2014-04-02 06:43

    My previous blog post was about the SSIS Lookup task and how it really works. Now that I have shown that the Lookup task shouldn’t be used for one-to-many or many-to-many joins, let’s take a look at the Merge Join transformation task. If you follow along with this blog, you will learn a little tip that will eliminate the requirement for you to add a SORT transformation task within your data flow task.

    Previously, we isolated our result set down to one employee in the AdventureWorks database and joined two tables together. I’m going to do the same thing here. This time, I am going to introduce a third table into the join logic. The three tables are listed below:

    • Employee Department History
    • Person
    • Department

    Here is what it would look like via SSMS with T-SQL:

    [Screenshot: the three-table join and its result set in SSMS]
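
    For reference, a query along these lines would produce that result set. This is a sketch against the AdventureWorks 2008R2-style schema, and the BusinessEntityID filter value is illustrative:

    -- Join Employee Department History, Person and Department for a single employee
    SELECT p.FirstName,
           p.LastName,
           d.Name AS DepartmentName,
           edh.StartDate,
           edh.EndDate
    FROM HumanResources.EmployeeDepartmentHistory AS edh
    INNER JOIN Person.Person AS p
        ON p.BusinessEntityID = edh.BusinessEntityID
    INNER JOIN HumanResources.Department AS d
        ON d.DepartmentID = edh.DepartmentID
    WHERE edh.BusinessEntityID = 1;   -- the single employee from the previous post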

    Let’s see how we can mimic this result set in SSIS without using T-SQL to join the three tables. First, I want to say that using individual tasks in SSIS instead of T-SQL is not always going to be the best option. I have learned over time that it is sometimes easier to write the join logic directly in your data flow source task. However, this is for demonstration purposes.

    Let’s say you received a request to extract a result set, order the results set, and load it to another location. Here is what your package would look like in SSIS using the Merge Join transformation task:

    [Screenshot: the SSIS data flow with Sort tasks and Merge Join transformations]

    Here are our results:

    [Screenshot: the merged result set]

    Notice that I used the Sort transformation task in the example above. I used it to depict what has to occur in a step-by-step approach:

    • Extracted data from the Person and Employee Department History tables
    • Sorted each result set
    • Merged the two results into one using inner join logic
    • Extracted data from the Department table
    • Sorted the first joined result set and the Department result set
    • Merged the joined result set from Person and Employee Department History with the Department table

    Let’s talk about best practice for this example. This is where the Sort tip is introduced. Since we need an ordered result set per the request, we are using the merge transformation instead of the Union All task. Additionally, we used the Sort task. The Sort task can heavily impact the performance of an SSIS package, particularly when you have larger result sets than what we are going to extract from the AdventureWorks database.

    Best practice is to bring in an ordered result set at the source and then merge your record sets. Well, how do you do that? Let’s walk through ordering your result set at the source and configuring your source to define the sorted column for merging your record sets.

    First, we open the task and add our ORDER BY clause to our source.

    [Screenshot: the OLE DB Source Editor with the ORDER BY clause added to the source query]
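
    As a sketch, the source query for the Person source might now look like this (the column choice mirrors the join example above, with the same schema assumptions):

    -- Ordered at the source, so SSIS no longer needs a Sort task for this input
    SELECT BusinessEntityID, FirstName, LastName
    FROM Person.Person
    ORDER BY BusinessEntityID;   -- this column will get SortKeyPosition = 1 below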

    Next, close the source task, right-click on the same source task, and choose the Show Advanced Editor option.

    [Screenshot: the right-click menu with the Show Advanced Editor option]

    There are two specifications in the Advanced Editor that need to be defined in order to make this work:

    • Click on the Input and Output Properties tab
    • Click on the OLE DB Source Outputs
    • Change the IsSorted parameter to “True”

    [Screenshot: the Advanced Editor with the IsSorted property set to True]

    • Drill down from the OLE DB Source Output into the Output Columns.
    • Click on the columns that you used in your ORDER BY clause.
    • Change the SortKeyPosition parameter from "0" to "1".

    The desired results should look similar to those below:

    [Screenshot: the Output Columns with SortKeyPosition set to 1]

    Next, you can remove each sort task that directly follows your OLE DB Source task by repeating the steps above to reconfigure each source editor. Now, my data flow task looks like this:

    [Screenshot: the data flow task with the redundant Sort tasks removed]

    We get back the same results:

    [Screenshot: the same result set as before]

    In case you are wondering why I got rid of all of the Sort tasks except for the one that follows the first merge join, I’ll explain. There are two reasons: my second join is on DepartmentID, and, most importantly, the Merge Join transformation is not considered a data flow source task, so it does not come with the functionality to define the sort order of its output.

    To conclude my second blog post of this series, the Merge Join transformation task can be used to merge columns from two different tables using join logic similar to the joins available in T-SQL. We have looked at a step-by-step breakdown of what has to occur to implement a Merge Join transformation task and discussed some tips and best practices regarding the Sort task in SSIS.

    I hope this blog post has been informative and that you look forward to reading my third post soon.

    Omnichannel retail changes the face of the merchandiser’s database

    Tue, 2014-04-01 09:48

    Retailers that fail to adapt to the e-commerce landscape seem destined to fall behind. Those that have executed an omnichannel product delivery approach have implemented complex data analytics programs to provide them with valuable market insight on both individual customers and groups of people. Helping them effectively manage this software are database experts well acquainted with the technology.

    Meeting expectations 
    Although online shopping has driven profits for merchandisers, it has also presented them with a list of new problems. One challenge that has grown with the prevalence of e-commerce is achieving customer satisfaction. Back in the days when the only place to purchase items was a brick-and-mortar store, it was enough to deliver a product that functioned the way it was supposed to at a reasonable price. Now, retail websites are expected to possess customized marketing campaigns for particular visitors and offer more rewards to loyal customers.

    Meyar Sheik, CEO of personalization software vendor Certona, claimed that without the appropriate data and actions to target shoppers with relevant, applicable information, it becomes very difficult for merchandisers to execute an effective omnichannel strategy. In this respect, possessing the programs capable of managing and translating such a large amount of data is just as much a part of the customer relations strategy as product development.

    Leaving it to the professionals 
    As retail executives grow more concerned about the intelligence derived from the data, many have hired database administration services to effectively implement and run the data analytics programs. In a way, these professionals do more than maintain a corporate network – they provide the expertise and tools necessary to keep the foundation of a business profitable.

    C-suite merchandisers aren't ignorant of this fact, either. According to a report released by research firm IDC, retailers are expected to heavily invest in big data and analytics projects in 2014, requiring the invaluable knowledge of IT services providers. In addition, the study showed that mobile applications connected to e-commerce operations will grow at a compound annual growth rate of 10 percent over the next three years. 

    From what can be gathered based on the latter statistic, smartphones and tablets are anticipated to be major participants in omnichannel strategies. It is very likely that database administration companies will be hired to oversee the connectivity of these devices and ensure that the avenues through which they communicate are not exploited by cybercriminals. 

    Overall, the functionality of data analytics tools and e-commerce software is dependent on IT professionals capable of assessing the needs of particular merchandisers. 

    Enterprises learn how to capitalize on big data

    Tue, 2014-04-01 01:42

    Due to the limited capabilities of a 24-person IT department faced with data analytics programs, many organizations have turned to database administration experts to monitor and operate them. Though they may not deploy the systems themselves, an outsourced service well acquainted with specific client operations can provide valuable insight for business executives looking to gain actionable digital information. 

    An unlikely friend 
    Organizations providing data analytics systems often push their products as "one size fits all" programs that may or may not be applicable to businesses engaged in different industries. Database administration services acknowledge the specific needs of each of their clients and how they intend to use digital information processing software. Some may collect real-time data points on individual shopping habits while others may be using predictive tools to anticipate product backorders during an impending snow storm. 

    According to CIO Magazine, rental-car company Hertz supplements its in-house analytics resources and data center with an outsourced IT service provider. Barbara Wixom, a business intelligence expert at the Massachusetts Institute of Technology, claimed that the nationwide organization relies on the database experts to purge unnecessary information, host and manage data and provide insights. One of the programs the company utilizes examines comments from Web surveys, emails and text messages so that store managers can get a better view of customer satisfaction. 

    Connecting with the rest of the company
    As database administration services encounter hundreds, if not thousands, of different data analytics programs in a typical work week, their personnel have developed the invaluable ability to communicate the results of those programs to the people utilizing them. Predictive analytics tools provide actionable results, but learning how they work can be a daunting task for marketing professionals just trying to get market insight on particular individuals or populations.

    Ron Bodkin, a contributor to InformationWeek, noted that acclimating individual departments to specific data processing actions is essential to the survival of a company. The writer cited Hitachi Global Storage Technologies, which created a data processing platform capable of hosting each team's separate needs and desires while still providing executives with a holistic view of all operations. 

    "Access to internal data often requires IT to move from limiting access for security to encouraging sharing while still governing access to data sets," claimed Bodkin. 

    The writer also acknowledged the importance of a general willingness to learn. Who better than database experts to educate unknowledgeable executives in how analytics programs operate? 

    Health care executives find challenges in new IT adoption

    Mon, 2014-03-31 01:59

    As the United States Centers for Medicare and Medicaid Services push health care providers toward electronic health record adoption, many industry leaders are finding the process to be much more difficult than the federal government anticipated. Many physicians are claiming that their in-house IT departments are struggling with implementation, while others are relying on database administration services to successfully deploy EHR programs. 

    Forcing deployment 
    As outlined by CMS, Stage 2 Meaningful Use requires all health care companies to utilize EHR systems by the end of this year. While some organizations have had better luck than others, the general consensus among professionals is that the industry was taken off guard by the mandate. Creed Wait, a family-practice doctor living in Texas, spoke with The Atlantic contributor James Fallows on a few of the issues hospital IT departments are facing. 

    In general, Wait noted that if the health care industry were ready to deploy EHR systems, participants would have done so of their own accord. Wait claimed that the current approach – forcing hospitals and treatment centers to acclimate to software that is, in a number of respects, poorly designed – is counterproductive to achieving better care delivery.

    "Our IT departments are swimming upstream trying to implement and maintain software that they do not understand while mandated changes to this software are being released before we can get the last update debugged and working," said Wait, as quoted by Fallows. 

    Let someone else handle it 
    In an effort to abide by stringent government regulations, some health care CIOs are turning to database support services capable of implementing and managing EHR programs better than in-house IT teams. According to Healthcare IT News, Central Maine Healthcare CIO Denis Tanguay noted that his workload nearly quadrupled once CMS' regulations came into effect. With just a staff of 70 employees to manage IT operations for three hospitals and 85 physician practices, Tanguay claimed that his department was buckling under the pressure. 

    "My CEO has a line: We're not in the IT business, we're in the health care business," said Tanguay, as quoted by Healthcare IT News. "This allows me to focus more on making sure that we're focused on the hospital."

    In order to resolve the issue, Tanguay advised his fellow executives that investing in a third-party database administration firm would be the most efficient way to streamline the EHR adoption process. The source reported that an outsourced entity specializing in network maintenance would be able to dedicate more resources and personnel to abiding by stringent CMS standards.