Imaging Performance Testing: MR8, ETBH, HBSP
#21
@admin,

Quote:During our testing, we first ran MR to create an image, followed by HBS. In this scenario, no matter how much we optimized our code, HBS could not match the performance level of MR, which consumed a significant amount of our time. Later, we inadvertently deleted all images and first ran HBS to create an image, followed by MR. HBS’s performance improved significantly, and only then did we realize the issue inherent to mechanical hard drives.

What you discovered is what happens in a real-world environment.  My Target internal 4 TB WD HDD always has six complete disk images on it: two each from MR8, ETBH, and HBSP.  A prudent backup strategy includes having more than one system image.  Each Friday, when I use one of the three programs to image my Disk 2 (OS SSD), I delete the older of that program's two backups, which leaves me with five (5) previous disk partition images, and then manually initiate a new system image.
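
For anyone who wants to automate a similar rotation, here is a minimal sketch in Python.  It assumes the images are plain files on the target drive named with a per-program prefix; the directory path and prefixes are hypothetical, not anything the three products actually enforce:

Code:
# Hypothetical rotation helper: before this week's manual image job,
# delete the oldest image for the program being used, so the new image
# brings that program's set back up to two.
# The path and filename prefixes below are made up for illustration.
from pathlib import Path

TARGET = Path("E:/DiskImages")   # hypothetical target image drive
KEEP = 2                         # images to retain per program

def prune(prefix: str) -> None:
    """Delete all but the newest KEEP - 1 images for one program,
    leaving room for the image about to be created."""
    images = sorted(TARGET.glob(prefix + "*"),
                    key=lambda p: p.stat().st_mtime,
                    reverse=True)
    for old in images[KEEP - 1:]:
        print(f"Deleting oldest image: {old.name}")
        old.unlink()

if __name__ == "__main__":
    prune("HBSP_")   # this week's program; "MR8_" or "ETBH_" on other weeks

Each program's own retention settings can do the same job; the sketch only makes the weekly routine explicit.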

I submit that a prudent user will seldom have only one backup image on a target drive.  A user who does so is placing all of their trust, and their data integrity, in a single backup image, which could be corrupt or become corrupted.  That, to put it simply, is just plain stupid.

I submit that Hasleo should cease relying on "perfect" Lab testing scenarios (a completely virgin Target backup drive) to assess the performance of HBSP in comparison to the other two products I am testing.  Both MR8 and ETBH contend with the same issue on my Target image drive: it is never "virgin."


I thank you for your kind words, and for appreciating that I am only conveying the results of the testing of MR8, ETBH, and HBSP on my hardware configuration.  I am pleased to be able to contribute concretely to the evolution of HBSP as a first-class imaging solution.

"Explaining" my results does not EXPLAIN how MR8 is, so far, able to image and verify faster than HBSP.  That being said, I think that HBSP, a relative newcomer to software imaging program product line, has no reason to justify their continually improving performance.  The HBS explanations that are being offered are issues with which all imaging programs must contend.

I have NO doubt that HBSP will continue to evolve and improve.  I am happy to be able to contribute my limited testing results to inform the Hasleo Backup Team of how their program works on my computer configuration.  As some say in English, I "have no skin in the game." I own licenses for all three products.

I do have a bias, however, which I freely admit: I would like to see HBSP overtake MR as the best imaging solution at some point in the future.

Happy New Year to the Hasleo Software Team, and to you, @admin.

Regards,
Phil
#22
Dear @Phil,

Thank you for sharing your detailed real-world testing experience and practical insights. Your multi-image backup strategy is a valuable example of prudent data protection, and your point about testing in non-"virgin" environments is well taken. This kind of feedback is exactly what helps us move beyond lab conditions and improve Hasleo Backup Suite for real-world use.

We appreciate your balanced perspective and your support as we continue evolving. Your testing contributions are genuinely helpful to our team.

Wishing you a wonderful New Year, and we look forward to continuing to evolve with everyone's feedback. 🎉

Best regards,
#23
@admin,

Thank you for your post.

As you noted, it would be great if more HBS users would share feedback on how HBS performs on their hardware configurations, even if they only run HBS.  That information from other users would be invaluable for identifying HBS bottlenecks that could be eliminated or mitigated.  I continue to hope that other users will share their results, so that the Hasleo Team can be apprised of more examples of the "real-world" performance of HBS.

To be honest, I think the fact that my previous posts could be seen to have been "jumped on" by Hasleo and other senior Forum members might be a disincentive.  That is why I have been strident in recommending that the Hasleo Team always welcome credible testing results.  "Explaining," or possibly being seen to "criticize," Forum members and users for their results is NOT in the best interest of the Hasleo Team or of the evolution and development of HBS as a best-in-class imaging solution.

All the best in 2026.  I look forward to continuing our dialogue.

Have a great day.

Regards,
Phil
#24
Maybe I'm in the minority here, but it seems that continuous posts like this aren't as helpful as one may think. The HBS team is very small, and having to constantly extract data from what I assume to be very similar posts in and of itself creates bottlenecks within the team. If a build specifically mentions tweaks to creation time, restoration time, verification time, etc., I will gladly test that new build. But other than that, I prefer to let the cooks cook, as it were, and not needlessly bother them.
#25
I didn’t want to write anything at first, but seeing that this topic seems to be going on forever, I’d like to at least say this:

I don’t think Hasleo or any other forum member was criticizing any of your tests. To be honest, I don’t see why you think or feel that this could be the case. All feedback is welcome, and I think Hasleo has always tried to make this pretty clear.

Nevertheless, it is still helpful to mention possible flaws in tests and how to prevent them. Personally, I wouldn’t be discouraged or "disincentivized" from posting any testing results after reading this thread. It doesn’t feel like "explaining" to me.

Just my 2 cents, hope I didn’t start another discussion now  Big Grin

Cheers, everyone!
#26
Well, you did start or at least continue the discussion. Smile
I agree with you; anything that relies on the presentation of data has to be open to clarification and to a discussion of possible flaws and of how the results apply, or do not, to the main question: in this case, how Hasleo's speed stacks up against the other programs. Hasleo was correct to give a reason why their HDD image checking may show slow results. The OP was correct to state that it is important that HBS results reflect real-world use rather than a "virgin" system.

I can tell you from real-time software testing that you must run your tests on virgin systems to set a baseline, because the variables in a dirty system cloud what your program is doing. Unfortunately, the variations in PC hardware and configurations make it impossible to say that in every case program A will outperform program B.
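
One concrete way to reconcile the two approaches is to time the same imaging job twice, once against an empty target (the lab baseline) and once against a target that already holds prior images (the real-world case), and report both numbers. Below is a minimal timing sketch, assuming the imaging tool can be launched from a command line; the command and its flags are placeholders, not an actual MR8, ETBH, or HBSP invocation:

Code:
# Hypothetical timing harness: run the same (placeholder) imaging command
# against a "virgin" target and a pre-populated target, so baseline and
# real-world results can be reported side by side.
import subprocess
import time

# Placeholder command; none of these flags belong to a real product.
IMAGE_COMMAND = ["my_imaging_tool", "--source", "C:", "--dest", "E:/DiskImages"]

def timed_run(label: str) -> float:
    start = time.perf_counter()
    subprocess.run(IMAGE_COMMAND, check=True)
    elapsed = time.perf_counter() - start
    print(f"{label}: {elapsed:.1f} s")
    return elapsed

if __name__ == "__main__":
    # The target drive's state is prepared manually before each run.
    input("Empty the target drive, then press Enter... ")
    virgin = timed_run("virgin target")
    input("Leave the prior images on the target, then press Enter... ")
    populated = timed_run("populated target")
    print(f"Slowdown vs. baseline: {populated / virgin:.2f}x")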
#27
Quote:The HBS team is very small, and having to constantly extract data from what I assume to be very similar posts in and of itself creates bottlenecks within the team. If a build specifically mentions tweaks to creation time, restoration time, verification time, etc., I will gladly test that new build. But other than that, I prefer to let the cooks cook, as it were, and not needlessly bother them.

I would respectfully disagree that users providing credible testing results could potentially cause "bottlenecks."

I would most certainly agree that "The HBS team is very small, ...".  In my view, that makes it even more important that they obtain "real-world" results from multiple user platforms.

The end goal of Hasleo, I submit, is to produce an imaging solution that excels on all manner of hardware configurations.  That is what will incentivize people to purchase and use the Hasleo imaging solution.

Having only a small team limits their means to create a massive testing environment.  We users can assist in that regard, and that is the only point of my testing: to assist Hasleo.

Have a great day, and all the best to Hasleo Forum members in 2026.

Regards,
Phil
#28
Dear All,

Thank you for your continued attention and valuable discussion.

First and foremost, we want to make it clear that every user's testing and feedback, including the detailed tests from garioch7, are extremely important to us. These diverse test results from real-world scenarios are an extension of our eyes and ears. As everyone knows, garioch7's testing helped us improve image detection speed and identify the issues causing performance regressions. This is a perfect example of how community strength directly drives product improvements. For a team of limited size, such assistance is invaluable.

At the same time, we fully understand n8chavez's perspective. His concerns stem from a genuine care for the team's efficiency—hoping that we, the "chefs," can focus wholeheartedly on "cooking" and devote our energy to core development. This is not to negate the value of testing but rather to hope that the "input" from testing can be integrated in a more efficient and less disruptive manner for the team.

Our goal is to strike the best balance between these two aspects, ensuring that critical issues can be quickly identified and addressed while avoiding information overload that could impact the development pace. We continue to welcome and appreciate everyone's testing and sharing of results, as the testing insights from every dedicated user can be more efficiently transformed into the building blocks of product progress.

Once again, we extend our gratitude to n8chavez, al3x, CDC9762, garioch7, and all users who have participated in the discussion and testing. Every contribution you make—whether it's in-depth test reports or thoughtful reflections on processes—helps us move forward more effectively.

We look forward to continuing our partnership in building a better product, together.

Wishing you all a happy New Year and a pleasant holiday season.

Best regards,
Hasleo Software Team