  1. #31
    New Lounger
    Join Date
    Jul 2012
    Location
    North Vancouver, BC, Canada
    Posts
    10
    Thanks
    0
    Thanked 7 Times in 4 Posts
    Quote Originally Posted by satrow View Post
    My preference for Intel SATA chipset drivers is the default Microsoft W7 and later, it does use some of the Intel drivers, allows TRIM on SSDs, etc., and can be faster than the IntelRST drivers (which may not be the 'correct' or best drivers anyway, RST primarily being for SSD caching, if my understanding is correct).

    Marvell/JRaid etc. chipsets, I don't connect anything to and disable them in the BIOS.

    I've seen PerfectDisk triggering BSODs when set to defrag at boot time and IntelRST drivers also implicated in causing BSODs.

    I haven't seen PerfectDisk trigger a BSOD at boot defrag. I have seen PerfectDisk trigger a BSOD at shutdown - which was not data-destructive. The only other thing I have seen regarding PerfectDisk at startup is what happens if the affected Hard Disk needs to run Chkdsk. PerfectDisk's Boot Defrag has a built-in integrity checker - which runs before the Boot-Defrag runs. If this integrity check fails - an abbreviated Boot-Defrag runs instead of the "full-monty" version.

    If this happens (and it did to me several times on Intel D875PBZ motherboards with ICH5R controllers when running WXP) - running a proper Chkdsk /r on the affected partition - to diagnose the disk and schedule a Chkdsk /r boot-run if necessary - cleaned up the problem. After that, a PerfectDisk boot-defrag ran properly and all was well.
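The diagnose-then-repair sequence described above can be sketched in a few lines (Python used here purely as a pseudo-wrapper; the /r flag is the standard Chkdsk one - locate bad sectors and recover readable data - but the helper function itself is just illustrative):

```python
# Illustrative helper for the Chkdsk workflow described above.
# The flags are standard; the function name and structure are not from
# any real tool, just a sketch of the sequence.

def chkdsk_command(drive: str, repair: bool = False) -> list:
    """Build a chkdsk invocation for the given drive letter.

    repair=False -> read-only diagnosis pass
    repair=True  -> surface scan and repair (chkdsk /r); on the system
                    volume Windows offers to schedule this for the next
                    boot, since the volume cannot be locked while in use.
    """
    cmd = ["chkdsk", drive.upper() + ":"]
    if repair:
        cmd.append("/r")
    return cmd

# Diagnose first, then schedule the repair pass if problems are reported:
print(chkdsk_command("d"))               # read-only check
print(chkdsk_command("d", repair=True))  # surface scan + repair
```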

    Note: All the above experience references PerfectDisk Version 12.5 - Build 312 with Hotfix 4 applied - OR - PerfectDisk Version 13.0 - Build 776 (latest of which I am aware). I have not had extensive personal experience with versions of PerfectDisk earlier than that. However my research before purchasing the product indicates that users trying to get by with using earlier versions of PerfectDisk or Ultimate Defrag - when using Intel or 3rd-party Hard Disk Controllers/Drivers - are taking avoidable risks.


    There is absolutely nothing wrong with using the Marvell controller on the Asus P5Q/P5Q3 series motherboards. I have mine set up as a RAID1 Array for my OS Drive - which is partitioned into 3 pieces (one for the OS, one for Data, one as a Scratch Drive for things like Photoshop). If you are avoiding the use of these extra SATA Ports because the "jungle drums network" thinks they're evil - you're missing out on a perfectly acceptable opportunity to add extra Hard Disks to your system, all on the basis of horror stories from people who had problems with these chipsets. IMO, these people have chosen to blame the chipset - instead of admitting they didn't do their homework regarding the required OROM and Driver Support. Google this. There's lots of info out there on how to make this stuff work properly - with people sharing their experience with these chipset OROMs and Drivers.

    Note: You seem perfectly willing to accept the risks inherent with the Intel Chipset OROMs - and even accept the risk of using Intel RST Drivers in some circumstances. Yet you categorically reject the need to perform the same due-diligence in regards to the Marvell/JMicron chipset OROM and Driver categories. This doesn't make sense.


    Simply apply the same logic required for the Intel RST Drivers to the Marvell/JMicron chipsets - and they become available to you as well. Yes, getting this stuff working initially is a pain - there is a learning curve. But that's what Motherboard Manuals, Motherboard Support Forums and Norton System Recovery (the replacement for Ghost) are for. I've played with this extensively - had endless problems with Windows-startup-failures after Marvell Driver updates - as well as return-from-sleep problems with the various Marvell Drivers available. Regardless, 10 minutes with NSR gets me out of whatever jam I get into while experimenting - and each time something goes wonky I learn more about the "bigger picture" in regards to Intel/Marvell OROM/Driver integration.

    Note: I've also found that Microsoft dip their fingers into this mix as well - since "things that didn't work before" in regards to Intel/Marvell Hard Disk Controller Driver integration - suddenly "work properly" after Patch Tuesday some months. Those NT Kernel updates buried in the "security updates" solve more problems than Microsoft admit publicly. Ditto for the various "DotNet" fixes.


    Then - on a related topic - there's the whole USB3 firmware/driver debacle with the various USB3 chipsets - and their impact on External Hard Disk compatibility when used for Backup purposes. This affects many many many Laptops using USB3 - as well as Desktops with native USB3 or USB3 Add-on PCI or PCI-E cards. Again it's almost always solvable - when the correct firmware updates for the USB3 chipset are installed - and the correct updated drivers are used. The exact same due-diligence routine is required here - as for the Marvell/JMicron Hard Disk Controller Drivers mentioned above.


    My $0.02

  2. #32
    Super Moderator satrow's Avatar
    Join Date
    Dec 2009
    Location
    Cardiff, UK
    Posts
    2,138
    Thanks
    102
    Thanked 208 Times in 181 Posts
    Quote Originally Posted by twixt View Post
    ... snip ...
    "jungle drums network", interesting.

    I only use 3 internal drives, I have no use for more; minidumps don't take up much drive space.


    dmps.jpg

  3. #33
    Super Moderator bbearren's Avatar
    Join Date
    Dec 2009
    Location
    Polk County, Florida
    Posts
    2,448
    Thanks
    14
    Thanked 251 Times in 199 Posts
    FWIW

    Metadata in NTFS is stored in the MFT, which is a reserved space on the HDD. Files themselves can sometimes be written within the MFT. Is the metadata defragged when the HDD is defragged? The MFT knows the filename and attributes and where all the file fragments are stored on disk. If a file has 13 fragments, there are 26 entries concerning those locations; beginning of fragment/end of fragment. If the file is defragged into a single contiguous file, there are only two entries concerning the file location; beginning of file/end of file. 24 now useless bits of information have been eliminated from the MFT on one file.
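The arithmetic above can be put in two lines of Python (illustrative only - real NTFS records extents as start-cluster/run-length pairs rather than literal beginning/end entries, but the bookkeeping scales the same way, two values per fragment):

```python
# Each fragment costs a pair of location entries in the file's MFT
# record, so defragmenting collapses 2 * fragments entries down to 2.

def location_entries(fragments: int) -> int:
    return 2 * fragments

before = location_entries(13)         # 26 entries for a 13-fragment file
after = location_entries(1)           # 2 entries once contiguous
print(before, after, before - after)  # 26 2 24
```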

    I've been using MyDefrag since it was JKDefrag, and I'm still using it. MyDefrag does more than just re-write files into contiguous space. It can be run in the background while you're using your PC, or it can be scheduled to run with Windows Task Scheduler. It can be safely stopped at any time. It uses the Windows prefetch logs to optimize file placement. If it finds files written within the MFT, it moves them out of that reserved area. For more information, check MyDefrag.

    Here is some more information on file systems, with some more things to worry about. Another way I look at defragging my HDD is that I'm actively refreshing my files little by little.

    I have the Windows background defragger and prefetch updater disabled, and use MyDefrag scheduled tasks to take care of that for me. I run a "daily defrag" (actually nightly), and a "monthly defrag". Do I notice any increase in system performance? No. Do I notice the so-called "Windows bloat" and performance slowdown? Nope. I have a youtube video of this PC booting Windows 8. In the video, I'm booting to the lock screen of Windows 8. You'll notice the date is January 5 - that was 2013. The video runs from the time my boot manager loads (immediately after the BIOS) and the OS to boot is selected. The video runs 31 seconds, OS selection is about 3 - 5 seconds of that.

    I just timed a fresh boot to the lock screen from OS selection in my boot manager - 26 seconds. Has defragging sped up my PC? No, not noticeably. But it certainly hasn't slowed it down. Programs launch and everything works as crisply now as it always has. YMMV
    Last edited by bbearren; 2014-01-31 at 15:31. Reason: clarity, grammar
    Create a new drive image before making system changes, in case you need to start over!

    "Let them that don't want it have memories of not gettin' any." "Gratitude is riches and complaint is poverty and the worst I ever had was wonderful." Brother Dave Gardner "Experience is what you get when you're looking for something else." Sir Thomas Robert Deware. "The problem is not the problem. The problem is your attitude about the problem. Do you understand?" Captain Jack Sparrow.
    Unleash Windows

  4. #34
    New Lounger
    Join Date
    Jul 2012
    Location
    North Vancouver, BC, Canada
    Posts
    10
    Thanks
    0
    Thanked 7 Times in 4 Posts
    Quote Originally Posted by bbearren View Post
    FWIW

    Metadata in NTFS is stored in the MFT, which is a reserved space on the HDD. Files themselves can sometimes be written within the MFT. Is the metadata defragged when the HDD is defragged? The MFT knows the filename and attributes and where all the file fragments are stored on disk. If a file has 13 fragments, there are 26 entries concerning those locations; beginning of fragment/end of fragment. If the file is defragged into a single contiguous file, there are only two entries concerning the file location; beginning of file/end of file. 24 now useless bits of information have been eliminated from the MFT on one file.

    I've been using MyDefrag since it was JKDefrag, and I'm still using it. MyDefrag does more than just re-write files into contiguous space. It can be run in the background while you're using your PC, or it can be scheduled to run with Windows Task Scheduler. It can be safely stopped at any time. It uses the Windows prefetch logs to optimize file placement. If it finds files written within the MFT, it moves them out of that reserved area. For more information, check MyDefrag.

    Here is some more information on file systems, with some more things to worry about. Another way I look at defragging my HDD is that I'm actively refreshing my files little by little.

    I have the Windows background defragger and prefetch updater disabled, and use MyDefrag scheduled tasks to take care of that for me. I run a "daily defrag" (actually nightly), and a "monthly defrag". Do I notice any increase in system performance? No. Do I notice the so-called "Windows bloat" and performance slowdown? Nope. I have a youtube video of this PC booting Windows 8. In the video, I'm booting to the lock screen of Windows 8. You'll notice the date is January 5 - that was 2013. The video runs from the time my boot manager loads (immediately after the BIOS) and the OS to boot is selected. The video runs 31 seconds, OS selection is about 3 - 5 seconds of that.

    I just timed a fresh boot to the lock screen from OS selection in my boot manager - 26 seconds. Has defragging sped up my PC? No, not noticeably. But it certainly hasn't slowed it down. Programs launch and everything works as crisply now as it always has. YMMV

    Some things to note:

    There are many pieces of Metadata that are not carried in the MFT.

    The complete list is as follows: http://ntfs.com/ntfs-system-files.htm

    Of the total list, there are some which cannot be defragmented while the OS is up and running. For these a Boot-Defrag is necessary. Any Defragger which does not offer a Boot-Defrag option does not defrag the affected metadata files - which end up strewn all over the Hard Disk as Windows adds/deletes files.

    Yes, the $Mft entries can be defragged "on the fly". No, these are not all the items each file "messes with" when created/deleted.
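For quick reference, the standard NTFS system files behind that link can be listed as follows (names and roles follow the standard NTFS 3.x layout; which of these a given defragger can move while Windows is running varies by tool and Windows version, so that detail is deliberately left out here):

```python
# The well-known NTFS metadata files, per the standard NTFS layout.
# Descriptions are abbreviated; see the ntfs.com page linked above.

NTFS_SYSTEM_FILES = {
    "$MFT":     "Master File Table itself",
    "$MFTMirr": "copy of the first MFT records, for recovery",
    "$LogFile": "transaction journal",
    "$Volume":  "volume label, version, dirty flag",
    "$AttrDef": "attribute definitions",
    ".":        "root directory",
    "$Bitmap":  "cluster allocation map (free vs. in-use)",
    "$Boot":    "boot sector and bootstrap code",
    "$BadClus": "bad cluster list",
    "$Secure":  "security descriptors",
    "$UpCase":  "uppercase character mapping",
    "$Extend":  "directory of optional extensions ($Quota, $ObjId, ...)",
}

for name, role in NTFS_SYSTEM_FILES.items():
    print(name.ljust(9), role)
```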



    Metadata "quirks":

    The most common Metadata file you will see Chkdsk fiddling with is the $Bitmap file. When "lost clusters" are found - this is the file that must be rewritten so the free-space on the drive that NTFS "thinks is there" - corresponds with what "actually is there". Systems where this value is corrupt will give "out of disk space" errors when there is lots of free space available - because NTFS is querying the $Bitmap metadata rather than the actual free space. Running Chkdsk /f brings theory and reality back into sync - along with fixing a bunch of other "sins".
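A toy model of that $Bitmap mismatch (cluster counts are made up, purely illustrative): NTFS answers free-space queries from the bitmap, not from the clusters themselves, so a corrupt bitmap reports "disk full" despite real free space, until a chkdsk /f-style rebuild resyncs it:

```python
# Toy model: the file system believes the bitmap, not the disk.

actual_in_use = {0, 1, 2}              # clusters really holding data
bitmap = {0, 1, 2, 3, 4, 5, 6, 7}      # corrupt: 3..7 wrongly marked used
TOTAL = 8

def free_clusters(bm):                 # what NTFS believes is free
    return TOTAL - len(bm)

print(free_clusters(bitmap))           # 0 -> "out of disk space" errors

bitmap = set(actual_in_use)            # chkdsk /f rewrites $Bitmap
print(free_clusters(bitmap))           # 5 -> theory matches reality again
```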

    Furthermore, when any new file-entry is created - the journaling system in NTFS creates a $LogFile entry for each transaction. This allows system crashes to be stably recovered in situations where FAT would fall over completely. The presence of the NTFS journaling system is the main reason an OS Crash on a System running NTFS tends to recover transparently to the user on restart - more so than systems running the FAT file system. In fact, this was one of the prime design criteria for NTFS as the replacement for FAT.

    See the following for more info: http://www.ntfs.com/transaction.htm


    Journal entries are usually created at the beginning of a segment of the hard disk where a set of file-operations is going to occur. However - once created - that $LogFile entry will stay at that location forever - until the Hard Disk is either reformatted or a Boot-Defrag is performed.

    To see an example of this, use the old Speed Disk found in Norton SystemWorks - on an NTFS-based WXP System which has been around for a while. After a full defrag using Speed Disk or any other Windows-based Defragger without a Boot-Defrag option - you will notice a whole bunch of little orange squares strewn all over the Speed Disk map. These are $LogFile fragments - which Speed Disk, Windows Defrag, and all other Defraggers that do not have a Boot-Defrag option cannot consolidate. Thus, every time you write new files to an NTFS volume - you refragment your files as they "skip around" those $LogFile fragments. This is utterly unavoidable - it is inherent to the design of NTFS.

    Note: If your Defragger is not showing you this - it is lying to you in order to "look good" - while not actually doing a "full-pull" job.
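The "skip around" effect just described can be simulated in a few lines (cluster numbers are made up; the point is that immovable fragments chop the free space into small runs, so any new file larger than the biggest run must itself fragment):

```python
# Toy simulation: pinned $LogFile fragments split the free space.

immovable = [(10, 12), (40, 42), (70, 71)]  # (start, end) cluster extents
DISK = 100                                  # total clusters, illustrative

def free_runs(pinned, total):
    """Contiguous free runs left between pinned extents."""
    runs, pos = [], 0
    for start, end in sorted(pinned):
        if start > pos:
            runs.append((pos, start - 1))
        pos = end + 1
    if pos < total:
        runs.append((pos, total - 1))
    return runs

runs = free_runs(immovable, DISK)
largest = max(b - a + 1 for a, b in runs)
print(runs)     # the free gaps between the pinned fragments
print(largest)  # any new file bigger than this must fragment
```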


    Thus, the only way to fully defragment an NTFS volume is to properly consolidate all the $Metadata files on the Hard Disk - which currently can only be done during a Boot-Defrag operation.

    Note: Even copying-out/reformatting/copying-in does not truly solve the problem - as the $LogFile data is recreated during the copy-in process and refragmentation starts anew as soon as file system operations occur during normal Windows operation. This can be verified by running PerfectDisk or Ultimate Defrag after a copy-out/format/copy-in operation. Even after a full defrag of the newly-restored system - a Boot-Defrag will still find lots of things to consolidate on its first run - due to the nature of how NTFS stores its metadata - and how it handles metadata capacity-expansion requirements.



    Visualizing how Metadata and the MFT interact:

    Ultimate Defrag allows the user to reposition not only the consolidated MFT/Metadata "block" - but also to sort the consolidated Metadata inside the "block" into whatever order the user wishes. This is a great way to visualize the interaction between the MFT and its associated Metadata.

    Note: Both PerfectDisk and Ultimate Defrag also allow the user to see the individual Metadata "blocks" - that reappear spontaneously during normal NTFS operations after a PerfectDisk or Ultimate Defrag pass - which cannot be reconsolidated back into the consolidated MFT/Metadata block until a Boot-Defrag is rerun. Such is the tao of an honest defragger.



    Final observations:

    It is no accident that people running Exchange Server - which is especially prone to NTFS fragmentation problems bringing Email performance to its knees - commonly use a special form of PerfectDisk to ensure their Corporate Email Systems run with acceptable speed. IMO, the above-described limitations in NTFS make that requirement self-evident.

    TANSTAAFL.

  5. The Following 2 Users Say Thank You to twixt For This Useful Post:

    ruirib (2014-02-01),scaisson (2014-02-01)

  6. #35
    4 Star Lounger
    Join Date
    Mar 2011
    Posts
    579
    Thanks
    4
    Thanked 35 Times in 30 Posts
    If you are using any third-party defragger, you should turn off the (default) Windows scheduled defragger first. If they use different algorithms, they may be working at cross purposes and waste a lot of computing time, not to mention drive life.
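One scripted way to do that turn-off step on Windows 7/8 (the task path below is the stock one on those versions, but verify it on your own machine with `schtasks /Query` first - this is a sketch, not a guaranteed recipe):

```python
# Build the schtasks command that disables the built-in scheduled
# defrag task. The task path is the stock Windows 7/8 one; confirm it
# exists on your system before relying on it.

TASK = r"\Microsoft\Windows\Defrag\ScheduledDefrag"

def schtasks_disable(task):
    return ["schtasks", "/Change", "/TN", task, "/Disable"]

cmd = schtasks_disable(TASK)
print(" ".join(cmd))
# To actually run it (needs an elevated prompt):
#   subprocess.run(cmd, check=True)
```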

    Your defragging principles and software should match the technology of the drive and the software. Drive technology is getting smarter all the time (SMART and all that). If you replace an existing drive with a later drive, you should do the homework and you and your software should adapt to the newer technology. (Do as I say and not as I do.) Check the reviews; I ran across a few recently (and may append links if I can find them).

    There has been no mention in this thread of so-called optimising, which is to say arranging the relative location of files on the drive into related groups that allow faster access to the most-likely-needed files relative to the others, irrespective of fragmentation. MyDefrag, mentioned in the thread, is capable of this. So, for that matter, is System Mechanic.

    I think tape was before 8-inch drives. You can see it on any Sci-Fi TV program with a mainframe in the background, and the spools never budge by a degree, which suggests there is no reading or writing going on. I have no comment on certain other historical matters.

    Editing already: BBearren did allude to the ability of MyDefrag to optimize (without using that word); the 'Go Sideways' reference reminded me to remark on drive errors, which are one quick way for everything to Go South for no apparent reason, and which should be the first thing to check before messing things up beyond repair; I have before me a one GB Verbatim flash drive for which the native format is FAT.
    Last edited by dogberry; 2014-02-01 at 00:45.

  7. #36
    Super Moderator bbearren's Avatar
    Join Date
    Dec 2009
    Location
    Polk County, Florida
    Posts
    2,448
    Thanks
    14
    Thanked 251 Times in 199 Posts
    Quote Originally Posted by twixt View Post
    Final observations:
    Exchange Server is not an issue for me. Running Windows 8 a full year on a moderate, low-end system (Dell Inspiron 580 with Intel Core i3 CPU) with absolutely no loss in performance, no BSOD's, no issues of any kind tells me a little bit. My system has three 1TB drives with 19 partitions (two are hidden) spread across them.

    Prior to the year of flawless, faultless performance with Windows 8, I ran the same system for two years dual booting two versions of Windows 7 with the same flawless, faultless performance and no fall-off. Windows 7 Ultimate has been running for 3 years on this machine (I upgraded Windows 7 Home Premium to Windows 8). That sort of experience tells me that the tools I'm using and what I'm doing are working just fine for me. I see no viable reason to change anything that I'm doing.

    YMMV

  8. #37
    Super Moderator bbearren's Avatar
    Join Date
    Dec 2009
    Location
    Polk County, Florida
    Posts
    2,448
    Thanks
    14
    Thanked 251 Times in 199 Posts
    Quote Originally Posted by dogberry View Post
    Editing already: BBearren did allude to the ability of MyDefrag to optimize (without using that word).
    Actually, I did use that word:
    Quote Originally Posted by bbearren View Post
    It uses the Windows prefetch logs to optimize file placement. If it finds files written within the MFT, it moves them out of that reserved area. For more information, check MyDefrag.
    From the MyDefrag site (emphasis mine):

    "MyDefrag organizes files into zones, such as directories, Windows files, files used while booting, regular files, and rarely used files. The most accessed files are placed at the beginning of the harddisk, and files that are commonly used together are placed in close proximity to each other. This results in a dramatic speed increase, and is in fact more important than defragmentation. The program comes with scripts with a zone organization suitable for most users, power users can customize the zones through scripts."

    Another neat optimization trick that MyDefrag performs is to allocate plenty of empty space before and after $Logfile so that, even though the file is immovable, it can grow without becoming fragmented. Each run of MyDefrag will analyze and adjust that empty space around $Logfile, so that there's always room to grow without fragmentation of the file.

    Similarly, MyDefrag puts empty space between its zones to allow room for temporary files to be written in close proximity to the program/utility/function that is creating them.
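That gap-allocation idea can be sketched as follows (the numbers and the helper are purely illustrative, not MyDefrag's actual algorithm): reserve slack on either side of an immovable file so it can grow in place without fragmenting:

```python
# Sketch of slack allocation around an immovable file. Units are
# clusters; all values are made up for illustration.

def place_with_slack(start, size, slack):
    """Return (reserved_start, reserved_end): the range kept clear of
    other files, leaving `slack` clusters free before and after a file
    of `size` clusters located at `start`."""
    return (start - slack, start + size + slack)

logfile_start, logfile_size = 1000, 64
lo, hi = place_with_slack(logfile_start, logfile_size, slack=32)
print(lo, hi)          # 968 1096: other files are kept out of this range

# After the file grows, a later pass re-checks and widens the slack:
logfile_size = 90      # grew, but still inside the reserved range
assert logfile_start + logfile_size <= hi
```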

    FWIW, I just timed a fresh start on my 3 year old Windows 7 Ultimate dual boot; from boot manager OS selection to lock screen was 30 seconds. I don't have a youtube video for comparison, but to the best of my recollection, that's about what it was in January 2010, give or take a second or two.

    I used \Windows\System32\Defrag.exe to analyze my Windows 7 Ultimate OS partition (MyDefrag uses the Windows defrag API to do all its file manipulations) and got the following report, which I think speaks for itself as to how well MyDefrag does its job for me.


    defrag analysis.PNG

    And as always, YMMV.
    Last edited by bbearren; 2014-02-01 at 11:16. Reason: added graphic

  9. #38
    3 Star Lounger
    Join Date
    Mar 2010
    Location
    USA
    Posts
    258
    Thanks
    49
    Thanked 32 Times in 25 Posts
    Thanks for all the inputs. Very educational, and lots of information to digest.
    Re: copy-out/reformat/copy-in and yet still having something to defrag afterwards. I'm confused.
    I thought that after a format the hard drive is 'raw': no data. Is NTFS formatting different? Is there still some ghostly trace left behind to defrag? What if I first format to FAT/exFAT, then format again to NTFS? Or even partition differently and then redo the partitioning? Would there still be a ghostly trace left behind?
    Please discuss and advise. Much appreciated.

  10. #39
    Super Moderator bbearren's Avatar
    Join Date
    Dec 2009
    Location
    Polk County, Florida
    Posts
    2,448
    Thanks
    14
    Thanked 251 Times in 199 Posts
    Quote Originally Posted by scaisson View Post
    I'm confused.
    I thought after a format, the hard drive is 'raw': no data.
    Before a format, the hard drive is 'raw'. Partitioning lays down the MBR and the partition table; formatting then writes the boot sector, reserves space for the MFT, creates the bitmap that records cluster allocation, and sets up a few other odds and ends. That's what formatting does.

    A housing development begins with the streets and utilities and surveyed lots before the house building ever starts.

  11. #40
    New Lounger
    Join Date
    Jul 2012
    Location
    North Vancouver, BC, Canada
    Posts
    10
    Thanks
    0
    Thanked 7 Times in 4 Posts
    Quote Originally Posted by dogberry View Post
    If you are using any third-party defragger, you should turn off the (default) Windows scheduled defragger first. If they use different algorithms, they may be working at cross-purposes and waste a lot of computing time, not to mention drive life.

    Your defragging principles and software should match the technology of the drive and the software. Drive technology is getting smarter all the time (SMART and all that). If you replace an existing drive with a later drive, you should do the homework and you and your software should adapt to the newer technology. (Do as I say and not as I do.) Check the reviews; I ran across a few recently (and may append links if I can find them).

    There has been no mention in this thread of so-called optimising, which is to say arranging the relative location of files on the drive into related groups that allows faster access to the most-likely needed files relative to the others, irrespective of fragmentation. MyDefrag, mentioned in the thread, is capable of this. So, for that matter, is System Mechanic.

    I think tape was before 8-inch drives. You can see it on any Sci-Fi TV program with a mainframe in the background, and the spools never budge by a degree, which suggests there is no reading or writing going on. I have no comment on certain other historical matters.

    Editing already: BBearren did allude to the ability of MyDefrag to optimize (without using that word); the 'Go Sideways' reference reminded me to remark on drive errors, which are one quick way for everything to Go South for no apparent reason, and which should be the first thing to check before messing things up beyond repair; I have before me a one GB Verbatim flash drive for which the native format is FAT.

    When PerfectDisk is installed, the installer options offer 3 different ways for PerfectDisk to handle "Windows Optimization". One of the methods is to continue allowing Windows to perform its own "Optimization" process undisturbed. The second is to allow PerfectDisk to "take over" these operations. The third is to have the system not perform "Optimization" at all. The user has the choice of selecting whichever pattern they wish.

    If the user opts to allow PerfectDisk to perform this optimization, the PerfectDisk Defrag process will use the data gathered by Windows for its own "Optimizer" - to perform a similar optimization process - except that PerfectDisk normally consolidates this data to the very front of the disk - where the data transfer rate is fastest. This allows Windows to start up in the shortest amount of time possible. PerfectDisk also marks the blocks on the disk involved with startup in a special colour in its disk map - so the user can see how fragmentation affects the bootup process. Similar options are offered with Ultimate Defrag - the difference being the user has full control of placement and priority for these files - rather than following the defragger's "pattern" without the ability to customize as desired.

    Note: Every month when Patch Tuesday rolls around - and every time the user installs/updates a program on their machine - the "Windows Optimization" information becomes obsolete. Thus the need for a Defrag pass to regain startup-efficiency. This is normal - and a standard consequence of how Windows operates. All modern Defraggers are aware of this limitation - and automatically react to reconsolidate the startup-fileset when the fileset is disturbed by changes which occur due to updates. There is nothing special about PerfectDisk, Ultimate Defrag or MyDefrag in this situation - even the built-in Windows Defragger performs this operation. What does change with various different defraggers is where that reconsolidated information is repositioned on the Hard Disk.


    As bbearren pointed out, an important part of modern Defragmenter operations is to "zone allocate" the total fileset on the Hard Disk - such that typical file-operations not only work with contiguous files - but also have those files situated close to each other on the Hard Disk. As he mentioned - this markedly improves the system's speed-of-response - both when loading programs and when loading datafiles. Pretty well all modern Defraggers do this. Competition in the Defragger field has led to improvements in Defragger efficiency - including the use of "zone allocation" - becoming standard "checkbox items" in any modern Defragger's feature list.


    The intent of Boot-Defrag in Defraggers that offer this option - is to avoid the need to do a copy-out/reformat/copy-in operation (or an OS reinstall) every 6 months to a year - in order to force the reconsolidation of $Logfile data.

    Note: One of the reasons a fresh install of Windows tends to be faster than one which has been in use for a while - is precisely because the $Logfile data is less fragmented. A Boot-Defrag lets the user regain that performance without going through the copy-out/reformat/copy-in (or reinstall) procedure.


    Both PerfectDisk and Ultimate Defrag are Hard Disk "SMART" aware. PerfectDisk has an explicit set of screens which report the Hard Disk SMART attributes and allow the user to view the SMART data if desired - so the user can "keep an eye" on their Hard Disk as it operates - rather than just rely on the SMART system to tell the user the disk is in imminent danger of failure. However, there are many other programs which also allow the user to view this information (such as SpeedFan) - so I don't consider the presence/absence of this option to be a dealbreaker.
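For those who prefer scripting to GUI viewers, SMART attributes can also be pulled with smartctl (from the smartmontools package) and parsed; here is a minimal sketch against an illustrative sample of `smartctl -A`-style output (the sample text is made up, not real drive data):

```python
# Minimal parser for smartctl -A attribute tables. Column layout:
# ID# NAME FLAG VALUE WORST THRESH TYPE UPDATED WHEN_FAILED RAW_VALUE
# The SAMPLE text below is illustrative only.

SAMPLE = """\
  5 Reallocated_Sector_Ct   0x0033   100   100   036    Pre-fail  Always       -       0
194 Temperature_Celsius     0x0022   036   053   000    Old_age   Always       -       36
197 Current_Pending_Sector  0x0012   100   100   000    Old_age   Always       -       0
"""

def parse_attrs(text):
    """Map attribute name -> raw value for lines that start with an ID."""
    attrs = {}
    for line in text.splitlines():
        parts = line.split()
        if parts and parts[0].isdigit():
            attrs[parts[1]] = int(parts[-1])
    return attrs

attrs = parse_attrs(SAMPLE)
print(attrs["Reallocated_Sector_Ct"], attrs["Temperature_Celsius"])  # 0 36
```

In real use the text would come from running `smartctl -A` on the drive; the parsing stays the same.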


    My experience is that Defraggers (either built into Windows or third-party) divide themselves into two groups. The first group does all the things mentioned by bbearren in his post about myDefrag. The second group does all the things mentioned by bbearren in his post about myDefrag - as well as the items I've mentioned which are performed by Defraggers that support Boot-Defrag operations.

    IMO, the use of a Defragger that supports a Boot-Defrag option is sort of like deciding to go to the Dentist for an annual cleaning. Is it absolutely necessary? No. But in the same way the cleaning also allows the Dentist the opportunity to check for "tooth rot" - the regular use of a Defragger with a Boot-Defrag option offers the user the same opportunity to prevent Hard Disk "bit rot" - without having to go through the procedure of removing all the user's teeth and then allowing them to grow in again (copy-out/reformat/copy-in operation).

    In Computer terms - it's not as if the choice to use one or the other of the Defragger types mentioned above is mandatory. It's that one method requires a tedious procedure to be performed every once in a while (copy-out/reformat/copy-in or OS reinstall). The other obviates that need - at the cost of a one-time expense to purchase a Defragger with Boot-Time Defrag capability.

    With full knowledge of the advantages and limitations of either approach - the user can then choose which option to exercise.


    Hope this helps.

  12. #41
    Super Moderator bbearren's Avatar
    Join Date
    Dec 2009
    Location
    Polk County, Florida
    Posts
    2,448
    Thanks
    14
    Thanked 251 Times in 199 Posts
    Quote Originally Posted by twixt View Post
    In Computer terms - it's not as if the choice to use one or the other of the Defragger types mentioned above is mandatory. It's that one method requires a tedious procedure every once in a while (copy-out/reformat/copy-in or an OS reinstall). The other obviates that need - at the cost of a one-time expense to purchase a Defragger with Boot-Time Defrag capability.
    In my experience, I have found no need (nor justification) whatsoever for "copy-out/reformat/copy-in or OS reinstall". For me, it's the monster-under-the-bed, Windows-needs-a-yearly-reinstall boogeyman - it simply doesn't line up with my reality and experience accumulated over many years and several systems, including a few DIY rigs, and a great many client systems. I routinely install MyDefrag on client systems (it's free, remember) and set it up in Task Scheduler. Then when I get a call from a client, "My computer is slow", it actually means their internet is slow, and I always find numerous toolbars, BHOs, etc. plugged into their browser. I uninstall/disable all the extras, and suddenly the computer is fast again.
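    Setting MyDefrag up "in Task Scheduler" boils down to a single schtasks invocation. A sketch in Python that merely assembles the command line - the task name and install path here are assumptions for illustration, not taken from the post:

    ```python
    # Sketch: build a 'schtasks /Create' command line for a daily scheduled run.
    # The task name and program path below are hypothetical examples.

    def daily_task_command(task_name: str, program: str, time_24h: str) -> str:
        """Assemble a schtasks invocation that runs a program daily at a set time."""
        parts = [
            "schtasks", "/Create",
            "/TN", task_name,         # task name shown in Task Scheduler
            "/SC", "DAILY",           # daily schedule
            "/ST", time_24h,          # start time, HH:MM (24-hour)
            "/TR", f'"{program}"',    # quote the path - it contains spaces
        ]
        return " ".join(parts)

    cmd = daily_task_command(
        "MyDefragDaily",
        r"C:\Program Files\MyDefrag v4.3.1\MyDefrag.exe",  # hypothetical path
        "03:00",
    )
    print(cmd)
    ```

    Running the resulting line once in an elevated prompt is equivalent to clicking through the Task Scheduler wizard.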

    When Windows is first installed, there are files scattered all over the hard drive; the installation is by no means contiguous. Hiberfil.sys is often in the middle of the drive. Can't shrink your OS partition below 500GB? It may well be that you have a system file (unmovable) sitting at the 500GB position on the platter. And no, it doesn't make a lot of sense. But don't take my word for it; try it on a system, and immediately run your favorite defragger to get a GUI display of fragmentation and which files are where.
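    The shrink limit described above is simple arithmetic: a partition cannot shrink past the end of its farthest-out unmovable file, no matter how much free space sits before it. A small illustration (the file positions are made up; a real query would come from the filesystem's defrag APIs):

    ```python
    # Sketch: why one unmovable file caps how far a partition can shrink.
    # Positions are illustrative, expressed as GB offsets from the start of
    # the partition to the end of each unmovable file.

    def min_shrink_size_gb(unmovable_file_ends_gb: list[float]) -> float:
        """The partition can shrink no further than the end of the
        farthest-out unmovable file."""
        return max(unmovable_file_ends_gb)

    # Hiberfil.sys ends at 250 GB; an unmovable metadata file ends at 500 GB:
    print(min_shrink_size_gb([250.0, 500.0]))  # → 500.0
    ```

    Which is why a mostly-empty partition can still refuse to shrink below 500GB: everything hinges on where that last unmovable file sits, not on how full the disk is.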

    On a new install of Windows, $Logfile is almost always contiguous, and close to the beginning of the disk. A first run of MyDefrag will put lots of free space before and after $Logfile, to give it plenty of room to grow, yet remain contiguous. Every subsequent run of MyDefrag will monitor this free space around $Logfile, and add to it as needed. The end result is that $Logfile doesn't become fragmented, and remains contiguous.

    I have no need for a boot-defrag, because I dual boot. If I felt the need, I could defrag one OS from the other and move system files that are not in use. Then when I'm booted back into that OS, the optimization algorithms of MyDefrag take care of any tidying up running under Task Scheduler. MyDefrag can be installed on a Rescue Disk for those who don't dual boot, and accomplish the same thing.

    My bottom line remains the fact that my systems continue to perform as crisply and error-free today as they did when I first set them up; this particular one beginning its fourth year. What I found that worked for me years ago still works for me the same way today.
    Create a new drive image before making system changes, in case you need to start over!

    "Let them that don't want it have memories of not gettin' any." "Gratitude is riches and complaint is poverty and the worst I ever had was wonderful." Brother Dave Gardner "Experience is what you get when you're looking for something else." Sir Thomas Robert Deware. "The problem is not the problem. The problem is your attitude about the problem. Do you understand?" Captain Jack Sparrow.
    Unleash Windows

  13. #42
    2 Star Lounger
    Join Date
    Jul 2012
    Posts
    102
    Thanks
    0
    Thanked 4 Times in 4 Posts
    I have 2-terabyte and 3-terabyte drives on my rig. It is still a good idea to use disk defrag because defragmentation often cleans up some of the mess in the file system, such as the reporting of file space. The difference with larger drives is that you do not have to defrag as often, but they still need to be defragged at some point; once a month is not extreme. Mechanical hard drives have increased in capacity, but mechanically they still work in much the same way. The problem you will experience with larger hard drives is not only the heat but finding a way to back up 3 TB, which can be quite a chore. An external 3 TB drive would be okay, but at some point it is going to get "strange". However, there is no way around this. In fact, the more stuff you put on a large hard drive, the greater the need for imaging and file backups. Having a large hard drive means having more stuff to lose!

  14. #43
    4 Star Lounger
    Join Date
    Mar 2011
    Posts
    579
    Thanks
    4
    Thanked 35 Times in 30 Posts
    To reply to the original question, I recommend Diskeeper. It is a paid but affordable product (the home version is thirty bucks and is good for three computers), it does the job in real time, and it gives you a place to go if you run into trouble with it. Set it and forget it, although an occasional visit to see if it reports any problems is in order. (One problem that I don’t know if they’ve cured is the ‘program not found – skipping autocheck’ error message that can turn up at boot time in certain circumstances. You can safely ignore it, but it may drive you crazy if you don’t go through whatever it takes to get rid of it.)


    I suggest you view the SpinRite video, which is interesting, and see if there is any freeware (or SpinRite itself) that interests you. SpinRite is not a defragmenter, but a disk maintenance program.


    I mentioned System Mechanic in a post, and if you already have it you may know that they have gotten on the optimization bandwagon in a big way in recent years, complete with an alarm to tell you it’s time you optimized. It is very good at detecting drive errors early in the game, and yes, it has a defragmenter. A number of other things in the suite work well – I just turn off everything I can find that goes on behind my back and run it manually, using what is useful to me. Prices are all over the map, so shop around if buying.


    MyDefrag is something I keep on the computer(s), and these days I just use that for defragging external data drives. Running it (or any other) on a daily schedule if you are set up for it is probably the secret to success, and keeping your system partition optimized makes a real difference in performance.

  15. The Following User Says Thank You to dogberry For This Useful Post:

    scaisson (2014-02-03)

  16. #44
    3 Star Lounger
    Join Date
    Mar 2010
    Location
    USA
    Posts
    258
    Thanks
    49
    Thanked 32 Times in 25 Posts
    Assuming most defrag software is safe, defragging is still time-consuming on a huge drive. Copy-out/reformat/copy-in, in my 2TB-drive case, is faster than using MyDefrag.
    With the understanding of $Logfile fragmentation, after a copy-out/copy-in I can elect to do a quick defrag (for the $Logfile) instead of reinstalling Windows. This two-step approach seems faster, and safer too - with a backup to boot.
    @MQG1023
    Too huge a drive risks losing huge amounts of data... Good point. After my nightmarish experience with the 2TB drive, limiting in-the-field hard drives to 1TB is a good compromise. Only use huge drives for backup storage.
