You may have noticed that my blog has been offline for a while now. It took me a couple of weeks to find the time to rebuild my machine after a hard disk failure. It’s still not 100%: images are not appearing with the blog articles for some reason. I hope to have that resolved this weekend.
I installed the latest CnPack Wizards 1.09.89 a couple of days ago and DX Seattle suddenly seemed much slower, sometimes pausing for several seconds before presenting an hourglass and coming back to life. A few times Windows actually detected it as not responding and asked whether I wanted to Debug or Close Delphi.
Today I uninstalled CnPack and DX Seattle is again responsive. Too bad, I like a few of their add-ins. Hopefully a subsequent release will resolve it.
At work I have an SSD that is only 220 GB, so it is nearly full almost all of the time. Gone are the days of lean software deployments, and I am a hoarder when it comes to information. As a result, when I attempted to install DX Seattle on my SSD, InstallAware told me I had sufficient space after I unselected various features.
After proceeding, InstallAware told me it did not have enough!
In the end I did manage to free enough space to get DX Seattle installed, although it was like playing Russian roulette: I ended up removing vital components of SQL Server Management Studio that I then had to repair.
I recently upgraded to DX Seattle, which contains the long-awaited (7+ releases) resolution for the “Out of Memory” issue. I was pleased to see that DX Seattle is now Large Address Aware, in addition to using memory more efficiently according to Marco Cantu.
So far, after installing Update 1 with the Modern Style hotfix, I have not experienced a single Out of Memory error. As a result, we are upgrading our licenses for the first time in 3 years, even though there is a big push to move the core products to .NET. Had EMBT addressed this earlier, perhaps we wouldn’t be so motivated to move.
I got notified today yet again about the IDE memory consumption issue (RSP-9568) that I continually encounter even in XE4, although it seems to be more severe in later versions. The Atlassian graph is very telling: it seems EMBT is not even treading water, let alone making progress on quality issues.
So without buying the latest release, do you really think you will get a free update when (or, more to the point, if) issues like the Out of Memory problem actually get fixed? No, you will have to buy the latest release, with all its new bugs.
Effectively you are paying for the new bugs at the same time as you pay for the old ones to get fixed (if they ever are), not to mention the cost in lost productivity, component upgrades, and installation time.
With .NET Native coming, Delphi will soon not even be able to claim to be king of the hill in terms of ease of deployment and fast startup times, and we certainly cannot claim to have a superior IDE in terms of stability or productivity, even, it appears, with the addition of Castalia.
I tried to post a comment on Nick’s blog, but for some reason I never got the email to verify my address, so I thought I would respond here.
I find it very interesting that a former EMBT employee who still has close ties would write such a post. I understand how he drew some of his conclusions, but I think he is mistaken, and I hope EMBT does not share the same views. Understanding the marketplace is very different from software development.
First of all, code quality is determined mostly by developer skill and the processes employed during development and QA. The earlier defects are found, the cheaper and easier they are to fix. The number and age of unresolved bug reports does not speak well of EMBT’s (and its predecessors’) concern for code quality. However, the primary driver, as with all things produced by a company, is the decisions made by management. While no code is perfect, if you accept that it is not possible to produce bug-free software then you have lost the incentive to try.
Having “volunteers” go through QC reports says that a company isn’t even willing to pay its staff to properly evaluate bug reports. Having a broken voting system and web QC, as well as an antiquated Windows client that is not being maintained, speaks volumes about what is important to a company. Admittedly, the bug tracking systems have finally changed; it doesn’t seem to have affected the number of fixes for long-outstanding issues, though. As people we spend our time on what we believe is most important. The same goes for companies.
Allowing half-baked (incomplete) or non-working visible things to remain in the product for years without fixing them (like the panel at the bottom of the Object Inspector that Marco wrote an add-in to remove, Error Insight, and the refactorings) is not only stupid IMHO, but a software development nightmare. Web companies are making money on minimalist applications that satisfy a small need, but do so extremely well. EMBT seems to prefer the old-school thinking of a humongous application that does everything under the sun, but does some things very poorly, and taints everything else with the same code smell. They don’t even seem to be willing to consider using the plug-in system to provide ‘a la carte’ choices to their customers, or to drop existing functionality that is not required, to limit their technical debt going forward. What other company has someone external to the company provide bug fixes for their IDE that apparently never get incorporated into the product? Andreas is still churning out the IDE Fix Pack!
Only in an emerging marketplace can you continue to sell flawed products that look good on the surface but don’t work as well as they should, to people who use them daily to make their living. Development markets are not emerging ones, nor are they without stiff competition.
Developers are smart people. EMBT is fooling no one with its choices when it comes to resource allocation: hire cheaper developers around the world to cut costs, focus on emerging markets where quick profits can be made, and increase the cost of the tools by coupling the mobile pack with the core dev tools and charging for each, effectively double dipping, while raising the cost of the product across the board and delivering diminishing product quality. Typical of a company acquired to provide profits to its purchaser, and not for any ideological reason.
The definition of insanity is doing the same thing over and over while expecting a different outcome. Developers will only continue to buy EMBT licenses or subscriptions as long as they have to in order to support their products, or where there is some advantage (none come to mind right now), because they aren’t insane. EMBT, on the other hand, cannot continue indefinitely down this path, unless they want to drive away their user base, or intend to extract as much cash from their users as possible until the bottom falls out and they close the doors.
Delphi is no longer a bleeding-edge tool, even in the mobile space. It has more competition than ever, and a major challenge from Microsoft/Xamarin/.NET Native and Apple competing in the marketplace. One of the reasons for its current success is the older user base supporting existing applications built when Delphi was bleeding edge, or trying to use their skills in the mobile space. That ride won’t last forever.
Perhaps I have digressed a little, but my point is that EMBT has shown its colours. They don’t care about code quality, or their actions would speak more to it; instead they want to add just one more feature into the box. Perhaps a subscription model would help a little, but it won’t result in more bug fixes. We have been voting with our wallets for some time now, in QC, and on the forums, and they are still not getting the message! As a customer I would have to be insane not to recognize the pattern, refuse to pay for bug fixes (it’s a great money-making machine if you can convince users to do so), and choose another toolset from a company that does care. If EMBT cared, we would have seen bug bounties a long time ago…
At work we’re developing C# .NET replacements for our Delphi apps as we speak! Good luck to EMBT. XE4, with the infamous ‘Out of Memory’ issues, will be the last release we buy…
One of the newest ORM entries in the commercial market is EntityDAC from Devart. If you are a Delphi developer you have probably heard of Devart, even if you haven’t actually used one of their products. The company has been developing Delphi-based technology since 1997, when it released its Oracle Data Access Components (ODAC). Devart has specialized in data access technologies, and since Delphi has predominantly been used to develop database-centric applications, if you haven’t heard of Devart, chances are you were living under a rock somewhere.
While there are numerous ORM/OPF solutions available for Delphi, only a few use the latest language features of Object Pascal. hcOPF, tiOPF, InstantObjects, and DObject were all conceived before the appearance of generics, anonymous methods, and the new RTTI, which opened the door to dependency injection and other modern best practices previously unavailable to Delphi developers. That is not to say that none of these frameworks has adopted newer language features, just that they were not initially written with them and do not require newer versions of Delphi. mORMot is an exception. Of the current ORMs (if I missed one, please let me know), only TMS Aurelius, DORM, and now EntityDAC require a later version of Delphi (2007 and up).
In August 2014, Devart released the first version of EntityDAC for the Delphi platform. EntityDAC builds on the 18 years of expertise Devart has acquired developing database drivers and other ORM-related products such as LinqConnect, Entity Developer (a designer for DB models), and dotConnect. There have been several updates since its release, so unlike many open source solutions, you know it is actively being improved.
EntityDAC uses a TDataSet descendant to present data, and recommends using data-aware controls to enforce validation, so for Delphi developers used to TDataSets it is as close to a drop-in replacement as you can get. EntityDAC supports lazy loading and Code-First, Model-First, or Database-First design, and is well documented.
One of the coolest features I always wanted to implement for hcOPF was visual object design. A complementary product, Entity Developer, is bundled with EntityDAC. It allows you to reverse engineer a database and create Delphi business objects, as well as design them from scratch. It can then generate your model code so you can immediately start consuming your business objects. Entity Developer is a Visual Studio shell-based application that is also capable of using the T4 templating supported in that IDE, so you can tweak the code generation as you see fit.
One of the questions I debated with other developers when I first wrote hcOPF was whether the ORM should support PODOs (Plain Old Delphi Objects, i.e. TObject descendants) as well as objects descending from my base class ThcObject. Limited RTTI precluded that possibility back in Delphi 3, and I have even interfaced ThcObject to experiment with enabling developers to use reference-counted PODOs. With EntityDAC you can use either TObject or TEntityObject as your base class. It also supports defining the database mapping with either Delphi attributes or XML files.
Not only is the solution very flexible in terms of the workflows supported, the getting started wizard makes it easy to get an application up and running, and an example app showcases some of its capabilities. The icing on the cake has to be LINQ support, which enables you to remove the SQL from your code, along with the coupling to the database it represents.
While I would love to dig deeper into EntityDAC, this is already getting to be a long post, so perhaps I will write subsequent ones if there is enough interest. Suffice it to say, if you are looking for a commercial ORM solution backed by a company with almost 20 years’ experience delivering high-performance database-centric solutions, I would recommend evaluating EntityDAC.
Get Awesomeness is a curated list of awesome Delphi frameworks, libraries, resources, and shiny things, inspired by the various awesome-… lists. It’s nice to have one-stop shopping for open source frameworks of interest. It certainly beats searching SourceForge, Google Code, GitHub, and all the other sites.
Note that only open-source projects are considered. Dead projects are mostly ignored, except for those that have no living analogs. Feel free to suggest other missing nice projects, either by comments or pull requests.
Last night I was reading some of the latest articles on Jon Lennart Aasenden’s blog about Quartex. For those of you not following Jon on Google+, he is the founder of the DelphiArmy and the author of the Smart Mobile Studio IDE. Quartex is an IDE for multiple languages (including Object Pascal) that Jon is working on, and it is to include transcoding between languages. IOW, you could take Delphi code and transcode it to C++ or C#. The approach is to use the LDEF intermediate format, from which you could either convert to a different language or compile to binary code.
Quartex will be built on Jon’s cross-platform framework and experience, and he is looking for others to join in. If the effort proves successful there could be another IDE choice besides Lazarus for cross-platform work. Imagine: a native IDE on the Mac compiling for OS X or iOS, or a Linux IDE targeting Linux and Android. No more networking between machines with intermediate apps like PAServer, or using VMs to target mobile devices.
No more .NET subsystems like Error Insight that produce lots of false positives and never get fixed. No more modelling support that no one uses, no more refactoring that works some of the time, blows up out of memory, or never comes back, and no more half-baked features that are abandoned. If Quartex ends up open source and you find a bug that bugs you enough, you can fix it yourself or hire someone to do so.
Product direction would be a community decision, and voting with both your voice and your wallet would yield results. Object Pascal as a language would have a much better chance of surviving! It’s either that, or we continue with the high number of Quality Portal issue reports, and articles like this one that show how EMBT’s strategy is working out in terms of quality (72 reports over the last 30 days without a single resolution).
I also thought I would mention that Marco has been active on the EMBT Quality Portal. It’s good to see someone from EMBT providing an explanation of the complexity of the issue and what they’re doing about it.
What I always fail to understand is why developers (especially Delphi devs) tend to write programs using datasets and the normal RAD approach when all the benefits of Object-Oriented Programming (OOP) are lost by doing so. Datasets were what we used back when I wrote Clipper programs in 1992. Clipper was not object oriented, so it’s like stepping back in time and ignoring all the advantages OOP has demonstrated over procedural programming.
When I was first learning OOP, I was taught that “there are three pieces of PIE”, where the acronym PIE stood for Polymorphism, Inheritance, and Encapsulation. You could of course say there are slightly more than 3 pieces (3.14) in a Π, or perhaps it’s just a case where the whole is greater than the sum of the parts.
IMO the characteristic, or benefit, most often overlooked when writing Delphi code is encapsulation. Encapsulation is what enables the decoupling of classes, which in turn determines your ability to maintain code over time with minimal breakage.
A good test of any code base is how easy it is to create a new application and re-use a class. Try it, and you will soon find out what the prerequisites are and how intertwined the code is. Not only does this affect your ability to develop in an agile fashion, it also makes re-factoring the code for better performance more difficult. One of the most common issues with legacy Delphi code is that all the work is done in a single thread (the main VCL thread). Progress reporting usually involves calls to Application.ProcessMessages() to ensure processing results are displayed in a timely fashion. This means the processing takes longer, and it is even more difficult to move to a background thread because it is tied to the VCL message loop as well as to other objects in the main thread.
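As a minimal sketch of the alternative (the form, ProgressBar1, and DoWorkOnRecord are hypothetical names invented for illustration), the loop can run on a background thread and marshal progress updates back to the VCL thread, with no Application.ProcessMessages() in sight:

```delphi
uses
  System.Classes;

procedure TMainForm.StartProcessing;
begin
  // Run the long-running loop off the main VCL thread
  TThread.CreateAnonymousThread(
    procedure
    var
      i: Integer;
    begin
      for i := 1 to 100 do
      begin
        DoWorkOnRecord(i); // hypothetical per-record processing
        // Synchronize runs the update on the main thread and waits,
        // so the captured loop variable is still current; no message
        // pumping is required to keep the UI responsive
        TThread.Synchronize(nil,
          procedure
          begin
            ProgressBar1.Position := i;
          end);
      end;
    end).Start;
end;
```

Because the worker never touches the VCL directly, the processing code stays decoupled from the message loop and is far easier to parallelize later.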
If I had to write an application from scratch I would:
1) use an ORM whenever possible. DataSets are compact and fast to retrieve; where they fall down is in the encapsulation of data and behaviour (business logic), which changes over time. ORM objects are more flexible, and they usually provide a validation framework as well as persistence. Sooner or later datasets will force you to break the DRY principle in anything but a trivial application.
2) use MVVM. While MVVM frameworks for Delphi are in their infancy, the concept can be implemented on a case by case basis. The idea is to keep as much code out of the View (form) as possible. What you really need for MVVM is some kind of object/data binding.
3) implement your data processing on a secondary thread. If you do so right off the bat, there is less of a chance that the data access will ever be tightly coupled with anything on the main thread, and parallelization becomes trivial.
4) use TDD to write at least the core classes in your onion architecture. The foundation on which you build an application needs to be bulletproof; otherwise you’re just building a house of cards, and when problems arise or changes are required, the termites start to come out of the woodwork. This also has the benefit that you can test and optimize the performance of each piece, so once it’s all put together it should be as lean as possible. TDD also forces developers to document the code they write in the form of tests.
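To sketch point 2 above: even without a framework, a hand-rolled ViewModel can be a plain class with change notification, so the form only ever talks to it (TCustomerViewModel and OnChanged are invented names, not from any particular MVVM library):

```delphi
type
  // The View (form) binds to this class and never touches the
  // dataset or business objects directly
  TCustomerViewModel = class
  private
    FName: string;
    FOnChanged: TNotifyEvent; // the View subscribes to refresh itself
    procedure SetName(const Value: string);
  public
    property Name: string read FName write SetName;
    property OnChanged: TNotifyEvent read FOnChanged write FOnChanged;
  end;

procedure TCustomerViewModel.SetName(const Value: string);
begin
  if FName <> Value then
  begin
    FName := Value;
    if Assigned(FOnChanged) then
      FOnChanged(Self); // notify the bound View of the change
  end;
end;
```

The form assigns a handler to OnChanged and repaints its controls there, which keeps all presentation logic testable without instantiating a single TForm.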
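And to sketch point 4: with DUnitX (the test framework itself is real; TOrder and its tax rule are hypothetical examples of a core business class), a test-first unit might look like this:

```delphi
uses
  DUnitX.TestFramework;

type
  [TestFixture]
  TOrderTotalTests = class
  public
    [Test]
    procedure TotalIncludesTax;
  end;

procedure TOrderTotalTests.TotalIncludesTax;
var
  Order: TOrder; // hypothetical core business class under test
begin
  Order := TOrder.Create;
  try
    Order.AddLine(100.0);
    // 13% tax on a 100.00 line should yield 113.00
    Assert.AreEqual(113.0, Order.TotalWithTax(0.13), 0.001);
  finally
    Order.Free;
  end;
end;
```

Writing the test before TOrder exists forces the class to be constructible and usable in isolation, which is exactly the decoupling the rest of this post argues for.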