archive-au.com » AU » T » TECHTARGET.COM.AU

  • Archiving unstructured data
    Fuzzy future for Fibre drives. Quality Awards: enterprise arrays. We present the results of the first-ever Diogenes Labs/Storage magazine Quality Awards; in the inaugural product category, enterprise arrays, see how users rated the major array vendors and which vendor came out on top. InfiniBand storage shipping soon. Migrating old files curbs disk costs: Archiving unstructured data, by Jerome Wendt. Companies must find ways to automate and simplify the process of archiving files and e-mail messages; ECM software addresses this large pool of unstructured data. A SAN for super sleuths. Bridging the gap: many disaster recovery and remote backup programs rely on an efficient, cost-effective WAN. Fiber-optic network technology is often required for long-distance data transmission, but you need to know which transport is best and the related implementation issues. Monolithic systems go modular. DR testing infrequent at best: have you tested your DR plan? Storage for manufacturing: manufacturing environments typically have different storage requirements than corporate apps, and must deal with globally dispersed design teams as well as growing regulatory concerns; here's how several prominent manufacturers have met the challenge. New tools to classify data, by Brad O'Neill. Putting data on storage systems appropriate to its value requires the ability to classify data; an emerging category of applications, Information Classification and Management apps, can index enterprise information and execute precise actions based on its content. Columns in this issue: Snapshot (multiple SAN fabrics common; how big is your SAN?); Getting serious about storage resource management tools (times have changed: storage resource management tools, once dismissed as hype, are becoming more and more useful); Storage Bin (a handful of big companies dominate much of the storage market, but some of the smaller guys have proven that they can innovate and have caught the eye of savvy storage managers); Why Windows is storage friendly (it's time to take Windows storage features seriously: two key technologies, Multipath I/O and the Volume Shadow Copy Service, demonstrate why Windows is much more storage-friendly than people think); and Wanted: better support, by Mark Schlack.

    Original URL path: http://searchstorage.techtarget.com.au/ezine/Storage-magazine/The-best-high-end-storage-arrays-of-2005/Archiving-unstructured-data (2016-02-10)


  • Agile design process takes business to the doghouse and the cloud
    to only pay for what they needed, as they needed it. As the business grew, they were able to increment the services in parallel and only pay for what they needed. In simple terms, if they'd purchased or operated their own servers, they would have under-utilised the equipment initially, and by the time they needed the capacity, their equipment would have been 12-18 months out of date. When 99designs started, they provisioned two servers; within 12 months they had grown to needing almost 40. "Whilst there's still the engineering overhead of building a system which scales neatly, being able to just incrementally add on servers, or remove them when we're done with them, is a dramatically different way of thinking for us," says Donald. The 99designs business is growing rapidly: it's currently serving 22 million pageviews per month and 30TB in images served per month, with a design uploaded every 5 seconds. While it would seem that the constant growth in the number of servers and bandwidth would contribute to a rapid increase in expenses, 99designs hasn't found this to be so. As their use has increased, they have received the benefits of scaled pricing and of Amazon reducing its base costs. So even though the technical requirements have increased 20-fold, the service costs haven't followed the same trajectory.

    Development in the Doghouse. 99designs began its life as an outpost of SitePoint. During the initial development of 99designs, there was a need to get things started far more quickly than a traditional development model would allow. The Doghouse model was created by the SitePoint team when they needed to get 99designs up and running quickly. They began the development session by putting a small team of designers and developers in a workspace together, so that they could collaborate to quickly build the first version of 99designs. The development session ran for about three weeks and involved long working hours. The name "Doghouse" was coined by staff, and it stuck. The idea was to keep the feedback loop very short: there was a designer, there was a developer, and there was Mark Harbottle, co-founder of 99designs. "So out of that came the first prototype of what would become 99designs," explains Donald. "It was about getting the smallest possible team together: a great designer, a great developer, and someone with the idea." Getting the right people involved is critical, as sessions tend to run for long hours and be very intense. To ensure that staff aren't burned out, Doghouse sessions are only run by the 99designs teams a couple of times each year. The approach works well, as the feedback and communication loops are greatly reduced. Also, part of each Doghouse team is part of the business management, so that business decisions can be fast-tracked to reduce the typical corporate delays that are part of many development projects. Jason Sew Hoy, the COO of 99designs, says that the key to

    Original URL path: http://searchstorage.techtarget.com.au/news/2240111873/Agile-design-process-takes-business-to-the-doghouse-and-the-cloud (2016-02-10)

  • NetApp announces Big Data solutions as market grows
    file systems. The NetApp High Performance Computing Solution for Lustre is purpose-built to efficiently scale bandwidth and density with proven reliability, to solve difficult research, modelling and simulation problems. The solution shortens time to results and increases compute efficiency with a storage solution optimised for the Lustre file system, and eliminates data bottlenecks with 30GB/sec writes. Power and cooling requirements and operational costs are reduced by maximising disk drive performance in less rack space. The NetApp Seismic Processing Solution provides efficient access to big data generated by seismic processing operations, to help exploration teams make optimal decisions. The high-density E-Series platform supports 1.8PB in each industry-standard 40U rack, and the solution manages high-bandwidth seismic processing operations with Quantum's StorNext file system. NetApp's approach provides an interesting contrast to that of other vendors: NetApp is offering a storage solution, whereas vendors such as EMC and Oracle are touting a dedicated platform that pulls the storage and analytics into one logical or physical device. Examples include the recently announced Oracle Big Data appliance, which includes hardware and software for data acquisition and analysis, and EMC's own dedicated big data platform, the Greenplum Modular Data Computing Appliance. A broader look at the big data marketplace reveals some interesting insights. For example, a recent Gartner report titled "Magic Quadrant for Data Integration Tools" reveals that many of the big enterprise computing players are active in this space, with IBM, Informatica, Oracle and SAP identified as leaders; the companies to watch, however, are Pervasive Software, Talend and iWay Software, noted as visionaries in Gartner's report. With the volume of unstructured data expected to grow by 800% in the next five years, it's clear that NetApp's announcement won't be the last in this growing market space.

    Original URL path: http://searchstorage.techtarget.com.au/news/2240110990/NetApp-announces-Big-Data-solutions-as-market-grows (2016-02-10)

  • News, Analysis and Opinion for Data management software - SearchStorage.com.au
    cartridges as its tape migration extends beyond ten years. April 05, 2010: Geoscience Australia selects IBM 3592 cartridges as its tape migration extends beyond ten years. Geoscience Australia is tendering for companies to move its old data onto IBM 3592 cartridges, and expects it will require such services for up to five years. March 09, 2010: EMC's Ionix sale analysed; the industry likes the idea. EMC's surprise decision to offload its Ionix product line is welcomed by many, but Gartner warns customers need to be wary of EMC's management roadmap. February 25, 2010: CA to sublimate XOsoft brand, launch ARCserve 15 in April. CA is planning a relaunch of its storage portfolio later this year, along with more partners using its products to deliver backup as a service. February 02, 2010: CommVault connects Simpana to the cloud. CommVault's Simpana can now dump your data into cloud storage services from Amazon, EMC, Iron Mountain, Microsoft and Nirvanix; updated with analyst reaction from Gartner and ESG. December 14, 2009: Symantec CEO Enrique Salem hints at unstructured data management tools. Symantec's CEO has hinted at new tools to ease management of unstructured data; thin provisioning management tools may also be on the company's agenda. November 17, 2009: Gartner: storage management software immature. Gartner's annual Symposium in Sydney, Australia, has been told that storage management software lacks the sophistication users need to effectively manage storage sprawl. August 22, 2009: Fibre Channel director face-off, Brocade vs Cisco, Part 1. Brocade's 48000 Director and Cisco's MDS 9513 Multilayer Director offer different paths to storage services and consolidation options; which company offers the best director for your storage environment? August 05, 2009: Q&A: best backup reporting tools. Backup guru Curtis Preston lists his favorite backup reporting tools and explains why you would consider these products. July 09, 2009: With one eye on the cloud, EMC enters enterprise management market. EMC has rebranded and revamped its management products, labelled them Ionix, and believes it can beat the likes of HP, CA and IBM's enterprise management efforts.

    Original URL path: http://searchstorage.techtarget.com.au/info/news/Data-management-software (2016-02-10)

  • 11 Tivoli Storage Manager tips
    seriously affect tape media utilization, and must therefore be used wisely. System-level collocation for small systems that do not have enough data to ever fill high-capacity tape media is a poor practice. Picture a 10 GB Web server backing up to its own 800 GB LTO-4 tape: that tape volume would occupy a library slot without ever reaching a significant utilization percentage. It should be noted that replacing tape storage with VTL or disk deduplication technology can significantly reduce the negative impact of both backup data fragmentation and small-system collocation on tape pools.

    5. Backup data change rate and retention. The rate at which data changes and how long backup data is retained are the most important factors to consider for TSM capacity planning. Data change rate has a direct impact on the volume of daily backup data, which in turn dictates network bandwidth and the overall performance requirement of the backup infrastructure. Meanwhile, the number of backup copies kept (versions) and how long they are retained has a direct influence on the backup data storage capacity (disk, tape or VTL). Versioning and retention should be defined based on business recovery requirements rather than convenience and nice-to-haves. Other than retention parameters imposed by regulatory compliance requirements, best practice is to define modest retention settings at first and increase as necessary, rather than starting large without ever knowing if it is too much.

    6. Policy domains and management classes in Tivoli Storage Manager. This is definitely an area where things can become complex and difficult to manage. At a high level, Tivoli Storage Manager manages systems' backup schedules, storage destinations and backup data retention based on logical groupings known as policy domains and management classes. Unless there are specific business reasons to treat backup data differently in terms of where it is stored and how long it is to be retained, it is preferable to keep the number of policy domains, management classes and backup schedules to a minimum for simplicity. Too many policies make the environment overly complex, difficult to manage and error-prone.

    7. Client option sets. The TSM backup clients can be configured to take advantage of numerous settings and options that reflect site-specific backup policies or apply to specific systems. While the TSM clients all depend on a local options file (dsm.opt) for basic settings such as the TSM server IP address, it is a good practice to create client option sets on the TSM server for system- or group-specific configuration options. In large environments with many TSM nodes (clients), it is a lot easier to centrally manage client option sets than many individual options files spread out across the environment.

    8. TSM database and logs. The TSM database must be backed up daily; ideally, it should be backed up twice a day, with one copy sent offsite and the other kept onsite for rapid restores. If roll-forward logging mode is
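    The capacity-planning relationship in tip 5 (daily change rate times retained versions drives backup storage capacity) can be sketched as a back-of-the-envelope calculation. The figures and the helper name `backup_capacity_gb` are illustrative assumptions, not values produced by TSM:

    ```python
    # Rough first-order model: capacity = one full copy of the protected
    # data, plus the retained incremental versions produced each day by
    # the change rate. All figures are illustrative assumptions.

    def backup_capacity_gb(protected_gb, daily_change_rate, versions_kept):
        """Estimate backup pool capacity in GB for TSM-style
        progressive-incremental backups."""
        full_copy = protected_gb
        incrementals = protected_gb * daily_change_rate * versions_kept
        return full_copy + incrementals

    # 10 TB protected, 5% daily change, 30 versions retained
    estimate = backup_capacity_gb(10_000, 0.05, 30)
    print(f"{estimate:,.0f} GB")  # 10000 + 10000*0.05*30 = 25,000 GB
    ```

    Doubling either the change rate or the retention roughly doubles the incremental component, which is why the tip recommends starting with modest retention and growing it only as business requirements demand.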

    Original URL path: http://searchstorage.techtarget.com.au/tip/11-Tivoli-Storage-Manager-tips (2016-02-10)

  • Capacity planning: Reclaim orphaned storage
    been released after some previous use. The data or storage may not have been de-allocated because someone forgot to tell someone else that the storage is no longer being used, or because documentation somewhere was not updated to indicate that the storage can be de-allocated and reprovisioned. Another cause of orphaned storage is system or application error. For example, over a period of time, inconsistencies can appear in databases or file systems, requiring a repair operation to free up unused yet allocated storage and index pointers. An example in Microsoft Windows is the chkdsk command, which can be used to identify and repair file system inconsistencies, including orphaned files or disk storage; UNIX-based systems can use the fsck command for the same purpose. Tools to help detect orphaned storage are available from operating system, application and database vendors, along with third-party vendors including EMC, HP, IBM, Microsoft, Monosphere, Novus, NTP, Opsware and Quest, among others. Consider the following eight items for finding and eliminating (or adopting) orphaned storage:

    1. Clean up temporary, scratch and work space on a regular basis.
    2. Run database, application-specific and file system consistency checks.
    3. Utilize vendor tools, or have your vendor check for orphaned devices.
    4. Leverage discovery and SRM tools to verify how storage is being used.
    5. Use configuration analysis tools to validate storage configurations.
    6. Look for files that appeared around the time a system or application error occurred.
    7. Have your DBAs check for duplicate data and orphaned rows or tables in databases.
    8. Institute policies and checklists as part of a clean-up after an application, operating system, server or storage upgrade, to help catch and prevent orphaned storage.

    Orphaned storage is a problem if ignored, because extra storage capacity and I/O performance are consumed to perform scans for backup, virus detection and other functions. Finding and eliminating (or adopting) orphaned data storage requires a time investment, including the use of tools to help discover and analyse data. The time invested in looking for abandoned and under-utilized storage can help provide additional storage capacity. Take a look at your systems and see how much orphaned data storage you can find; unless you are already aggressively looking for it and performing regular data and storage reclamation cleanup, I suspect you will find some. Do you know how to rescue stranded storage?

    About the author: Greg Schulz is founder and senior analyst with the IT infrastructure analyst and consulting firm StorageIO. Greg is also the author and illustrator of Resilient Storage Networks (Elsevier) and has contributed material to Storage magazine and other TechTarget venues. This was first published in October 2006.
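    One item from the checklist above, looking for files that have sat untouched long enough to be candidates for review, can be approximated with a short script. This is a minimal sketch, not one of the vendor tools mentioned; the cutoff, the starting path and the function name `stale_files` are all assumptions:

    ```python
    # Walk a directory tree and report files not modified within a cutoff
    # window, as candidates for orphaned-storage review. Candidates still
    # need human review before deletion or re-allocation.
    import os
    import time

    def stale_files(root, days=365):
        """Yield (path, size_bytes) for files not modified in `days` days."""
        cutoff = time.time() - days * 86400
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    st = os.stat(path)
                except OSError:
                    continue  # file vanished or is unreadable; skip it
                if st.st_mtime < cutoff:
                    yield path, st.st_size

    # Example: total size of year-old files under the current directory
    total = sum(size for _path, size in stale_files(".", days=365))
    print(f"{total / 1e9:.2f} GB of candidate orphaned data")
    ```

    A modification-time scan is deliberately conservative: it flags candidates only, since a rarely written file may still be read regularly, which is why the tip pairs this kind of discovery with vendor tools and DBA checks.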

    Original URL path: http://searchstorage.techtarget.com.au/tip/Capacity-planning-Reclaim-orphaned-storage (2016-02-10)

  • Archive or backup?
    a series of weekly full backups followed by daily incremental backups that are kept for a predetermined amount of time (i.e., 30 days). In order to keep a copy for a longer period than usual, an out-of-sequence copy must be created; that is, a copy that is not associated with the 30-day retention in our example. This is where the attributes of an archive start to take shape: we can think of an archive as an out-of-sequence copy, a copy that is not associated with other copies (i.e., fulls and incrementals) for retention purposes. Let's look at other attributes that should differentiate an archive from a backup object. Archives should not be retained simply based on the number of existing copies; each archive should be a unique object bearing a time stamp, a descriptor and a retention parameter. We typically back up data to protect it from being lost or altered, and because it must remain readily available; it would therefore go against the rules to delete a file after backing it up. Conversely, data is often archived so it can be deleted from its original location, because immediate access is no longer required. Archived data can be extracted from its original context and catalogued or indexed for later retrieval. This is the case for CAS or email archiving products, where a message or attachment is taken out of its usual structure and stored elsewhere. As a general rule, we can go back to the days of paper records and draw a parallel with today's backups and archives. Back then, records were typed or handwritten, and carbon copies or photocopies were used for backups. When a document lost some of its daily business relevance but still had to be retained, it was taken out of the filing cabinet, put into a cube box and sent to some basement or warehouse to be kept as an archive. That said, this is pretty much where the similarities end: we don't have a problem reading a paper document that was archived 50 years ago, but the same cannot be said about electronic archives. In closing, and without trying to oversimplify things: if a record is copied for protection, we can probably call it a backup; if the same record is stored on some media without particular concern for immediate access, it's probably safe to call it an archive.

    About the author: Pierre Dorion is a certified business continuity professional for Mainland Information Systems Inc. This was first published in July 2006.
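    The distinction the author draws, backup copies expiring by version count within a sequence versus archives as unique objects carrying their own descriptor and retention, can be modelled in a few lines. The field names here are invented for illustration and do not correspond to any particular product's schema:

    ```python
    # Sketch of the two object types: a backup copy is one of N retained
    # versions in a sequence, while an archive is a standalone object
    # with its own descriptor and per-object retention.
    from dataclasses import dataclass
    from datetime import date, timedelta

    @dataclass
    class BackupCopy:
        source: str
        taken: date
        versions_kept: int = 30  # expires when pushed out of the sequence

    @dataclass
    class ArchiveObject:
        source: str
        taken: date
        descriptor: str      # catalogued/indexed for later retrieval
        retention_days: int  # retention set per object, not per sequence

        def expires(self) -> date:
            return self.taken + timedelta(days=self.retention_days)

    a = ArchiveObject("finance/q2-results.xls", date(2006, 7, 1),
                      "Q2 financial results", retention_days=2555)  # ~7 years
    print(a.expires())  # 2013-06-29
    ```

    The key difference the model captures: deleting an old `BackupCopy` is a side effect of new versions arriving, whereas an `ArchiveObject` is retained (and found again) entirely on its own terms.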

    Original URL path: http://searchstorage.techtarget.com.au/tip/Archive-or-backup (2016-02-10)

  • Troubleshoot and Solve Data management software Problems - SearchStorage.com.au
    accelerates access to student and business services with (case study, data management software). 11 Tivoli Storage Manager tips: get the best from Tivoli Storage Manager and learn how to troubleshoot common TSM problems with this collection of eleven tips. Capacity planning: Reclaim orphaned storage: this tip takes a look at how to maximize your storage capacity utilization by finding orphaned data and storage that can, in turn, be re-allocated for another productive use. Archive or backup?: how long does a backup object have to be retained before it is considered an archive? Or is it actually not just a simple matter of time? This tip presents information that will help you draw your own conclusions on a not-so-cut-and-dried subject. Archiving unstructured data: companies must find ways to automate and simplify the process of archiving files and e-mail messages; ECM software addresses this large pool of unstructured data. Capacity planning in the enterprise: capacity planning is becoming an increasingly important way to balance future hardware costs with computing needs.

    Original URL path: http://searchstorage.techtarget.com.au/info/problemsolve/Data-management-software (2016-02-10)


