You are browsing the archive for Mike Orriss.

Welcoming the new RemObjects Software Developer Evangelist!

October 1, 2013 in non-tech

RemObjects is very pleased to welcome Steve Scott (aka “Scotty”) to the post of Developer Evangelist for the company!

Scotty brings a wide range of skills to the table, as his CV shows:

Steve Scott has been a developer for 26 years. He started in 1987, writing accounting software in COBOL on a Convergent Unix machine, using vi as his code editor. Since then he has worked on mainframes, minis, 16-bit and 32-bit Windows (C++, Paradox and Delphi), .NET (C# and Visual Basic), and OS X and iOS (Objective-C and Cocoa). During his career he claims to have worked on more hardware and learnt (and forgotten) more languages and IDEs than any sane man should have to.

During the late 1990s he was an active member of the Delphi community, spending some time as the technical lead of the UK Borland User Group (UKBug), as well as being the technical lead for the popular DCon conferences and a regular speaker at BorCon.

As the 21st century took hold, he made the move to .NET and became a respected speaker at many .NET conferences around the world.

In 2007 he jumped ship and headed into the world of Apple to work on Mac and iOS software. In this new world he became the founder of the iDeveloper Blog, host of the iDeveloper Podcast and creator of NSConference.

Scotty will be focusing on improving the user-facing side of the company – working on bringing our products to the attention of more developers worldwide, improving accessibility and discoverability for new users, and working with the teams on enhancing the overall user experience. You can also expect some more RemObjects TV and Radio in the near future.

Outside of his responsibilities at RemObjects, Scotty will continue to be an asset to the developer community at large, through his other projects such as the iDeveloper site and podcast, and NSConference.

Announcing the RemObjects Connect online support forum

October 17, 2011 in non-tech, RemObjects

Dear customers,

Earlier this year, we introduced you to a beta version of our new web-based support community, which we created to replace our newsgroups (and go beyond them).

While we at RemObjects have always been big fans of NNTP, we’re finding it increasingly difficult to ask new (and younger) developers to set up news readers (which many regard as legacy technology) just to get support. NNTP also makes it harder for “casual” visitors to our site (such as potential customers) to look into the support community before deciding to give our products a try.

We think a web-based solution, done right, will solve these problems and offer capabilities that NNTP can’t. There are other advantages, as well.

After reviewing many different options, and not really being happy with the majority of “phpbb”-like forum solutions out there, we have finally got something that we are happy with: a web solution that lets you easily connect to the forums from multiple machines without losing track of what you have already read and what is new, and that lets you subscribe to email notifications or subscribe to/favorite interesting threads.

We’ve been testing this solution for our private beta groups and with TeamRO for a while now, and in May we opened it up, in “BETA” mode, to all customers. Now the time has come to officially switch over to the new system.

Initially, “RemObjects Connect” will provide web-based forums in a “Stack Overflow” style (which we think works better than traditional web forums, which are usually clumsy to navigate and hard to stay on top of). Over time, we plan to add more features, such as a marketplace for RO-related services (contractors, job-seekers, etc.), “social” aspects and more.

To get started, go to connect.remobjects.com. For existing customers and people on our mailing list, if you click login, this will redirect you to our portal site, ask you to sign in with your usual RO website login, and then send you back.

For guests, i.e. those without logins, the public forums are available but in read-only mode, thus allowing prospective customers to review our support. Also, guests may sign up for an account, to gain read/write access to the public forums.

We plan to switch the “old” public newsgroups into read-only mode by the end of next week, and we encourage everyone to wrap up their existing conversations. We will keep the newsgroups around “for reference” for a while longer, and are considering options for migrating their content into a searchable web database for the long term.

Hope to see you on RemObjects Connect!

Mike

Use of Data Abstract’s Memory Tables without a Database

July 9, 2011 in Data Abstract

This post was prompted by my reference to TDAMemDataTables in my recent podcast about our wiki.

First, a bit of background: ever since I first started developing, I’ve always been a bit obsessed by execution speed. I think this started with my first ever contract, back in 1973, writing IBM 360/370 assembler for a book publisher. The task sounds simple – produce an application to print address labels fast! Back then, though, the notion of printer drivers didn’t exist – we wrote directly to the hardware. The twist to this contract was that we got paid by the speed improvement over the existing system. At the time, labels were printed one line at a time and the label size was inflexible. I produced a solution that took two parameters (label width and page width) and wrote complete pages with a single call to the printer, achieving almost a factor of 50 in speed improvement.

Later, 20 years ago, I had a Paradox for DOS contract to rewrite budget reports for a large international company. The thirty reports themselves were fairly simple and similar, apart from currency conversions everywhere, but as before, speed was the problem, as the existing reports were taking nearly all night to produce. On examination, the reports formed a hierarchy, but they were all written against the main data tables. Paradox was particularly slow on wide tables (i.e. those with many fields). It was a simple job to produce several temporary narrow tables at suitable places in the hierarchy, and this resulted in a total run time of less than two hours. Such things help one’s reputation, and the initial three months was extended to nearly five years, with me finishing up as a project manager :)

These two examples illustrate one point – the power of caching to improve efficiency. Since adopting Delphi in 1994 (very impressed by its compilation speed), I’ve always tried to write efficient code. Nothing annoys me more than tardy screen refreshes. So, over the years I’ve tried many caching techniques with Delphi – for example, MemoryStreams, StringLists and IniFiles – but they all fell short in one way or another.

Since Data Abstract introduced the Local Data Adapter (LDA), I’ve found a much better way of caching, and that is what this blog post is all about. Returning to my podcast reference at the top, it is easiest to explain using a simplified version of the problem I had to solve.

We generate the basic structure of wiki pages using XML files as input. These XML files contain data obtained from the interface sections of the source files of all our products. As some wiki pages contain data that ultimately comes from up to three different XML files, some pre-processing is inevitable. This data can be used many times before the XML files are re-processed, which then updates the data actually used by wiki page generation.

To keep it simple, there are four main temporary files used:

  • Pages: each page entry has a list of Types
  • Types: one or more entries per page
  • Members: contains child records for Types
  • Signatures: contains child records for Members

These four are the most important of about eighteen I actually maintain. The extra tables are mostly subsets of these four. For example, the Parents table contains the parent of each class, interface etc. and its index within the Parents table. The display text for the parent is stored too (i.e. whether displayed as a page link or normal text, so page production does not have to check each time it processes the class hierarchy).

My app is written using Delphi 2009, so these tables are used via TDAMemDataTables and loaded/stored to/from .daBriefcase files. Why do I do this, you may ask? Well, this is a single user application and all these files *can* be generated from scratch but they are purely there to gain speed. The only change from the standard database use of TDAMemDataTable is to hook up the LDA instead of the normal Remote Data Adapter (although field creation is different, as discussed below). All the usual functionality is there but I tend to use them in a slightly different way.

Having dropped a new TDAMemDataTable component onto the DataModule and hooked it up to an LDA component, you add fields manually to the table via a right-click. All the standard types are available and, once you have all the needed fields, you can save to a .daBriefcase file. An empty table is then available to my app by merely loading it from the file. All the normal operations are available for inserting/updating/deleting etc., and you can save back to disk at any time.

I use these datatables in a slightly unusual manner, to get as much speed as I can. First, I always set the Filtered property to True; combined with Filter = '' (the default), this still leaves the table unfiltered. I then use the Filter property instead of explicit SQL calls. The TDAMemDataTable filter syntax is very comprehensive (it is actually translated into appropriate SQL). Most importantly, it is very fast, and an additional advantage is that I can update a filtered table in situ with a standard insert/edit/delete followed by a post.

For a similar reason, the Types/Members/Signatures tables are not hooked up as master/detail/detail. For optimum performance, I filter the detail tables only when actually needed, rather than on every navigation of the master table. When it is needed, setting a filter on the detail tables is very fast. Setting appropriate indexes is important, though, if the tables are relatively large. Just as you use ApplyUpdates with the normal Remote Data Adapter (RDA), the same call can be used to apply changes to several local tables as a single transaction.
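The filtering pattern described above can be sketched roughly as follows (a sketch only: the table and field names – tblMembers, TypeId, CurrentTypeId, Visibility – are made up for illustration; only the Filtered/Filter behaviour is as described in this post):

// Keep the table permanently in filtered mode; with Filter = ''
// (the default) the table behaves as if unfiltered.
tblMembers.Filtered := True;

// When a particular Type is selected, narrow the detail table
// on demand instead of maintaining a permanent master/detail link:
tblMembers.Filter := Format('TypeId = %d', [CurrentTypeId]);

// The filtered view can still be updated in situ:
tblMembers.Edit;
tblMembers.FieldByName('Visibility').AsString := 'public';
tblMembers.Post;

// Clearing the filter restores the full table:
tblMembers.Filter := '';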

Using these techniques, I can display the structure of a complicated page, ultimately derived from three XML files, in less than a quarter of a second. There are many other scenarios where this type of code could be useful. Hands up those who have loaded aaa=bbb,ccc,… type data into a TStringList, only to discover that they need to process it in two or more different orders? Setting up a TDAMemDataTable and using its filter and sorting capabilities is much nicer to work with.
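As a rough sketch of that TStringList scenario (the table name tblWork, its fields and the input file are hypothetical, and the exact filter expression is an assumption about the filter syntax; the dataset operations are the standard ones mentioned above):

var
  Lines: TStringList;
  i: Integer;
begin
  Lines := TStringList.Create;
  try
    // Each line has the form aaa=bbb,ccc,...
    Lines.LoadFromFile('data.txt');
    for i := 0 to Lines.Count - 1 do
    begin
      tblWork.Append;  // standard dataset-style insert
      tblWork.FieldByName('Key').AsString := Lines.Names[i];
      tblWork.FieldByName('Values').AsString := Lines.ValueFromIndex[i];
      tblWork.Post;
    end;
  finally
    Lines.Free;
  end;
  // The same data can now be viewed in any order or subset without
  // reloading, e.g. only the entries whose key starts with 'db':
  tblWork.Filtered := True;
  tblWork.Filter := 'Key LIKE ''db%''';
end;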

And Another Thing

When I was developing the wiki code, I soon discovered that being able to display the data in a table would make development easier. It was surprisingly easy to produce an app (which I called BriefView) that displays any .daBriefcase file in a DevExpress grid. I won’t show you everything in my BriefView application, as the missing items are very basic Delphi components, such as buttons and dialogs, which should be obvious from the code below.

The Data Abstract components are fewer than you might expect: TDADataSource, TDAMemDataTable and TDABin2DataStreamer. I am also using the TcxGrid component from Developer Express, for reasons you will see below.

FormCreate logic is very simple:

procedure TForm1.FormCreate(Sender: TObject);
begin
  if ParamCount > 0 then begin
    FileOpen.Dialog.FileName := ParamStr(1);
    FileOpened(nil);
  end;
end;

The FileOpened code is also called via a button, to replace the currently displayed file with another:

procedure TForm1.FileOpened(Sender: TObject);
var
  dc: TcxGridDBDataController;
  fn: string;
begin
  CloseFile;
  // Load the table (including field definitions) from the briefcase file
  dtWork := TDAMemDataTable.Create(nil);
  dsWork.DataTable := dtWork;
  dtWork.LocalDataStreamer := LocalDataStreamer;
  dtWork.RemoteFetchEnabled := False;
  dtWork.LogicalName := ChangeFileExt(ExtractFileName(FileOpen.Dialog.FileName), '');
  dtWork.LoadFromFile(FileOpen.Dialog.FileName);
  // Rebuild the grid columns and restore any saved layout/filter settings
  dc := tvWork.DataController;
  dc.GridView.ClearItems;
  dc.CreateAllItems;
  fn := ChangeFileExt(FileOpen.Dialog.FileName, '.grid');
  if FileExists(fn) then
    tvWork.RestoreFromIniFile(fn, False, False, [gsoUseFilter, gsoUseSummary], dtWork.LogicalName);
  FileSaveAs.Dialog.FileName := FileOpen.Dialog.FileName;
end;

This is virtually the whole application. The first half loads the table from disk, including FieldDefs etc. The second half is concerned with the UI only, and is all the code needed to set up the grid with the correct column widths, together with persistence of filtering/grouping/sorting and of which columns are displayed and in what order.

Note that it starts with a call to CloseFile, which is also called when the application closes, but first I need to show you SaveFile (which is called via a button):

procedure TForm1.SaveFile(Sender: TObject);
var
  fn: string;
begin
  // Save the table data and the grid layout (.grid) side by side
  fn := ChangeFileExt(FileSaveAs.Dialog.FileName, '.grid');
  dtWork.SaveToFile(FileSaveAs.Dialog.FileName);
  tvWork.StoreToIniFile(fn, True, [gsoUseFilter, gsoUseSummary], dtWork.LogicalName);
  FileOpen.Dialog.FileName := FileSaveAs.Dialog.FileName;
end;

Finally, the simple CloseFile:

procedure TForm1.CloseFile;
var
  fn: string;
begin
  if dtWork = nil then
    Exit;
  dsWork.DataTable := nil;
  // Persist the grid layout before freeing the table
  fn := ChangeFileExt(FileSaveAs.Dialog.FileName, '.grid');
  tvWork.StoreToIniFile(fn, True, [gsoUseFilter, gsoUseSummary], dtWork.LogicalName);
  FreeAndNil(dtWork);
end;

Summary
This post, hopefully, has given you some ideas as to how underlying Data Abstract technology can be used in your own projects.

RemObjects Mailing List

February 22, 2011 in non-tech, RemObjects

We have become increasingly concerned about the growing number of bounced emails when we send information out to everybody on our database. When a bounce occurs, we set the ‘unreachable’ flag in our database. This can happen the first time (e.g. for ‘no such address’) or after several successive failures (e.g. for ‘mailbox full’, which suggests the mailbox is no longer read).

The last email we sent to everybody was on 25 Jan, for the then-upcoming DSConf in Las Vegas. Note: we received a significant number of bounces since the previous DSConf mailout on 7 Jan. Many of these were bounced by www.messagelabs.com, which appears to be run by Symantec.cloud.

If you didn’t receive that email when you would have expected to, it’s likely because:
- you didn’t notify us of a change of address
- it’s stuck in your spam filter
- you or your ISP bounced it for spam or other reasons.

If you want our emails, especially if you are a customer (for renewal reminders, news of upgrades and special offers) and you can’t find any reason why the email didn’t reach you, please email me at mikeo@ and I’ll find out why we set you as unreachable and when.

Alternatively, if you no longer want to receive emails from us, please write to me at the same address asking me to unsubscribe you.

36 Ways to get your Message Bounced when …..

June 25, 2007 in RemObjects

… you send an email to registered customers.

Last week, we sent emails to everybody on our mailing list announcing the availability of our latest and greatest and we were very unhappy to see the number that were immediately bounced back!

So, we work hard implementing features requested by customers, and then we can’t even tell some of them what is now available! The reasons provided by our customers’ ISPs varied greatly (remember, the following list relates *only* to addresses that were registered with us and are needed for obtaining downloads, etc.). The 36 reasons were variations on the following themes:

  • Account does not exist
  • country ip access denied
  • Host or domain name not found
  • Invalid recipient
  • Mail appears to be unsolicited
  • message looks like SPAM to me
  • Message rejected due to content restrictions
  • no mailbox here by that name
  • not listed in Domino Directory
  • not listed in public Name & Address Book
  • recipient rejected
  • relaying denied from your location
  • that domain isn’t in my list of allowed rcpthosts
  • The recipient cannot be verified
  • This address does not receive mail
  • This mail server requires authentication when attempting to send to a non-local e-mail address
  • this recipient is not in my validrcptto list
  • your mailserver is rejected by local block list

Those all look like permanent problems, but there were also those that may or may not be temporary:

  • Can’t create output
  • Mailbox disabled for this recipient
  • Mailbox has exceeded the limit
  • mailbox is full: retry timeout exceeded
  • Mailbox unavailable or access denied
  • mailbox unavailable or not local
  • The user has not enough diskspace available
  • unrouteable address

As I said, these all relate to addresses provided by customers to communicate with us. The most common reason for the problems above is that the customer has changed their email address without telling us. They may not even be aware that our mail is being blocked.

This leaves us in a catch-22 situation: how do we inform people that we can’t reach them? Publicizing names is not an option, as we value the privacy of our customers. The only thing I can do right now is to blog and post on newsgroups to ask customers whether they received our email about our ‘Vinci’ releases.

If you are one of our customers and would have expected to receive this email, please write to support with sufficient details for us to identify you from an address which we can reply to, and we will get your account brought up to date.

Conversely, if you received an email (because you expressed interest in our products at some stage) and no longer wish to receive them, please reply to the email asking to be removed from the mailing list. We are not spammers and certainly do not want to waste our and your time with unwanted emails.

Vinci Part #10 – Updating Website Articles

May 12, 2007 in Data Abstract, Wiki

Over the years, RemObjects Software has accumulated a lot of online articles – and I do mean a lot. The big problem for me is that they were written at different times and our software *never* stops evolving, so many are out of date to a greater or lesser degree.

So, all hands to the pump to get things sorted, but this presented a different challenge – how to manage it without taking all the existing articles offline?

We solved this by creating two issues in our bug database for each affected article (existing and planned new ones). A ‘prepare’ issue manages all the potential changes and the ‘publish’ issue handles the actual update including creation of the attached pdf file etc.

With updates occurring over a matter of weeks and with several people involved, this was likely to be rather confusing to our customers, so we needed to make it very obvious as to the status of each article.

Accordingly, each article that we have examined to see whether it needs changes for ‘Vinci’ carries an unmissable image in its top right-hand corner, which can be one of the following:

  • ready for ‘Vinci’
  • update pending
  • TeamRO contribution
  • external contribution
  • legacy article

Thus, unless you see ‘update pending’ displayed within an article, you can be sure that we are not currently working on it. Note: at the time of writing we have not added icons to all articles yet – we are still working through them – as I said, we have a lot of articles!

To help you filter articles, we have also added ‘ready for Vinci’, ‘update pending’ and ‘legacy’ categories (contributions already existed).

We have many new articles planned but it would be premature for me to mention them by name. Keep your eyes on our website – they will be published as they are ready.

ROFX 5.0 to focus on .NET 2.0 and Visual Studio 2005

January 17, 2007 in Data Abstract

Just before Christmas we canvassed our customers to see whether anyone could see a need for Visual Studio .NET 2003 support for further development of ROFX (the RemObjects SDK, Data Abstract and Hydra).

The good news is that we did not receive a single reply mentioning VS2003! We did, however, receive several queries about ongoing support for Mono, but that’s a different story (in short, we will continue to actively support Mono, and consider it an important target platform).

Accordingly, for version 5.0 of ROFX we are planning to focus our new development on .NET 2.0 and Mono, in Visual Studio 2005 and Borland Developer Studio “Highlander”, to best leverage these new technologies. ROFX 4.0.x will continue to be available to users of VS2003 and .NET 1.1, and we will keep the option of supporting these releases with critical bug fixes *beyond* the release of version 5.0.

Annual Project Report for RemObjects Software

December 20, 2006 in RemObjects

It is just over a year since I became General Project Manager here, so this is a good time to reflect on what has happened within and outside the company over this period, together with some thoughts on the future.

It has been a good year for RemObjects Software despite various uncertainties in the marketplace. At the beginning of 2006 we faced some of the usual problems that hit every company as it grows — the need to consolidate and maintain existing products and features while continuing to innovate. At the time, we were having problems matching the promises made by our Roadmap (www.remobjects.com?roadmap), largely because there was no overall cohesion — work on one product tended to delay another. Accordingly, we devised a coordinated schedule to ensure that ongoing work occurred on all current products. This was phased in, and we met our first target at the beginning of July with our first simultaneous release of updates to all products.

Since then we have maintained a two-monthly release schedule and have met every target date. Indeed, we were actually able to release a week early last weekend! For 2007, we intend to continue shipping bi-monthly updates, but more on that below.

In the Delphi world, the year was overshadowed by the uncertainty over the Borland/CodeGear story. We are often asked whether we intend to continue supporting the Win32 product line and the answer is a resounding “yes” — and, as you will see below, we intend to do more than that.

Our current focus consists of three main areas:

  • Existing Win32 and .NET products (Data Abstract and the RemObjects SDK)
  • Cross-platform and .NET migration support (such as Hydra 3 and cross-platform interoperability in RO and DA)
  • Chrome

Taking each of these areas in turn, I will talk about achievements during 2006 and plans for 2007 (without giving too much away).

For the existing products, 2006 was mainly a year of consolidation, although there were several highlights. In particular, the Super TCP Channel was extremely well received and we are now considering other new channels. I think many of our customers were surprised by how easy it was to switch to the new channel – a tribute to the underlying architecture. In 2006, the focus for Data Abstract was mainly on ease of use, and the new templates and wizards definitely help the beginner get started. Next year, we intend to continue on this front, but we also have some exciting new technologies on the drawing board. My lips are sealed, though, because it is too early to know exactly which features will be available when. I can say, however, that a lot of the preparatory work towards providing a lean and mean client-side dataset, to use instead of the standard Delphi dataset, has already been done, with extremely pleasing performance results.

At the beginning of the year, while tossing around ideas, we thought that providing a bridge between Win32 and .NET would be cool. Allowing developers to taste .NET within existing applications and being able to migrate fully or partially at their own pace just had to be good. Hydra 3.0, released last week, is the result. This product is unique and we are very proud of it. Having said that, we definitely regard it as a starting point only and you will see major enhancements over the coming months.

Chrome is the third prong of our focus. Our mission with Chrome is very simple: provide a leading-edge compiler that supports all the latest functionality coming out of the Microsoft labs, thus allowing you to experience the new ideas as swiftly as possible. This has meant regular trips to Redmond, where we have been able to work on integrating with the new technologies with the help of the Microsoft developers. If you are a VSIP partner, you may be interested to see the “Visual Studio Form Designer Integration” article which we wrote for Microsoft. It is 30 pages long and published as part of the Visual Studio SDK. For 2007, we will continue improving and evolving the Chrome product, and a great deal of work has already been done for version 2.0 (Joyride). Many customers already know this, as we have a very strong, committed set of beta testers who see the progress almost on a weekly basis. For example, we are actively working with LINQ and, indeed, so are some of our testers.

Although we focus on three separate areas, they are, in fact, very complementary. For example, we provide a Delphi Hydra sample that displays a managed plugin written in Chrome using .NET 3.0 and WPF! Whichever way you want to maintain and grow your existing applications, we intend to provide the tools to let you choose how to do it. Being able to incorporate managed code into existing Win32 applications gives you extreme flexibility.

During the past year, there have been other changes within RemObjects Software that are worth mentioning here. First, we increased the size of the development team. Some of the results of this should already be apparent (the number of change items per two months is growing steadily), but you haven’t seen the full benefit yet. It takes time to train up new people to adhere to company standards and fit into the team. We have an excellent team and I’m expecting great things over the coming year.

Next, I would like to talk about support, which can be a thorny subject for small companies. During the year, we revised the way we provide support, so that it is now managed as a project in its own right, and support items are now subject to the same triage and prioritization as other development items. Although this may have slowed some individual answers, it has meant that our overall service has improved greatly, allowing us to provide help where it is most needed. One thing I should mention here is that it will help us greatly (and indirectly benefit all customers) if you choose to post a question on the newsgroup OR send an email, but please never do both. No matter how we receive bug reports, they go into the same database, and duplicated reports just cost us time in recognizing and handling them. Thank you for your cooperation on this.

Changes have also occurred on our website, with the introduction of our new DevCenter (available at www.remobjects.com/devcenter). DevCenter provides a portal to all the latest resources available and one centralized place to see new articles, videos and FAQs as we produce them. Customers who have purchased products from us recently will already have seen our new shopping cart system, which we have brought in-house. This has enabled us to produce a more personalized shopping experience. At the same time, the revised My RemObjects page (see www.remobjects.com/myro – login required) allows you to see the exact status of your licenses and provides easy access to downloads, licenses and upgrade options.

Documentation is another thorny subject, especially for me. During 2006, we made some considerable progress, notably with the complete rewrite of the RemObjects SDK and Data Abstract help files, but we know and acknowledge that this is our weakest area. We do intend to improve, though, and my personal goal for 2007 is to spend more time on documentation. In 2006, my time was virtually consumed by project management, but that should change now that the majority of my work in that area has been automated using our internal BugClient application (it sure helps having products like RO/DA/HY at your disposal for implementing such a project). If you are interested, see the blog series I’ve just started, which describes how the automation was achieved (http://blogs.remobjects.com/blogs/mikeo/).

So, what will 2007 bring for RemObjects Software? The first event on our radar is our internal technical conference, being held in Berlin early next month. It is worth noting that the Super TCP Channel and the Hydra 3 cross-platform technology were conceived at a similar conference at the beginning of this year.

During the latter part of 2006, we have been developing our ROFX products in two parallel versions: 4.0.x and 5.0. We have already done a great deal of preparatory work for 5.0. At our conference, we intend to schedule the major items over 2007 and beyond, so that we can provide you with a steady stream of enhancements every two months.

Nothing is finalized, though, and this is the perfect time for you to tell us how you would like any or all of our products to develop. We can’t promise to adopt every idea, but we can promise that each will be given serious consideration. In particular, we are really interested in what kind of help you need with expanding your Delphi projects or moving them to .NET.

Another thing we will be discussing at our conference is whether ROFX 5.0 will continue to support Visual Studio .NET 2003. We are almost decided that it won’t, because it appears that the vast majority of .NET users have moved to .NET 2.0 and Visual Studio 2005 by now. But if we hear from a significant number of customers that you are unable to upgrade to VS2005, we may decide to keep VS2003 support (reluctantly, as this would cost us time and prevent us from using shiny new technology).

Before I finish, I would like to share some personal thoughts about our industry as a whole. I see a lot of comments along the lines of “why do we want or need .NET” and similar negativity. It so reminds me of when Windows took over from DOS. There were naysayers then too, albeit fewer, because the benefits were more obvious and arrived more quickly. I believe that the various technologies coming from Microsoft (WPF, .NET 3.0, LINQ, Cider etc.) will soon have the same impact. Once there are some glossy (or glassy, perhaps) applications out there, today’s user interfaces will be seen to be lacking. That is why I’m pleased to be working at a company like RemObjects, which embraces the new and provides its customers with early adoption of new technologies. End of my personal soapbox.

Well, that’s about it for my first report. Have you found it useful or interesting? If so drop me a line (mikeo@remobjects.com) with comments or suggestions. The more feedback I get, the sooner I will write another one.

Finally, I wish you all happy holidays and a prosperous 2007.

Mike Orriss
General Project Manager
RemObjects Software
http://www.remobjects.com

From XLS to BC5 – Part 1

December 3, 2006 in RemObjects

This is the first of a rambling series of articles that details the evolution of a home-grown project management system. It is not intended to instruct or to promote any particular system, although knowing the problems met and mostly overcome may prove useful.

A disclaimer is needed here: these articles are all my own work and are personal views only. It is hoped that I will write nothing that upsets RemObjects Software, but only time will tell!

This article merely sets the scene and shows why I produced a spreadsheet in the first place. The story starts just over a year ago, when I assumed the role of General Project Manager at RemObjects Software. Before then, I had a minor role at the company (which suited me fine, having had serious health problems).

The company at that time was having problems (I feel safe to say that now, after this length of time). These were the normal problems faced by a small start-up company having to switch to maintaining existing applications in parallel with creating new ones. Although a lot of work was being done, it wasn’t properly focused, resulting in us missing promised roadmap targets. This, coupled with personnel changes, meant that some serious project management was needed.

The main tool in use was the Mantis bug management system and there were actually three separate databases (and I added another, but that comes later). At the time, I wasn’t a big fan of Mantis, but my respect for the system has grown considerably over the past year.

Time for a little digression. I have a love/hate relationship with certain software applications. Some I gel with instantly and others cause me no end of grief. Whether I love or hate something doesn’t always seem to match the experiences of others. For example, I loved Virtual Access and only switched to the XanaNews newsreader when development of the former virtually (sorry!) ceased. In contrast, I have had nothing but bad memories with VMWare, which others seem to love. A lot depends, in my view, on why and how you are using an application. When you *have* to use one, the learning curve can definitely put you in the ‘hate it’ category.

Although I didn’t hate it, I didn’t love Mantis either, particularly when I tried to use it for project management. It was fine for viewing and updating single issues, but I found it far too slow and restricted when trying to view things at a management level. So, I needed a proper project management tool. In the past, I have had great success with dedicated tools such as Microsoft Project and various time-recording tools, but I did not feel these were an option at the time. Since we were behind schedule, I needed to use a system that did not impact the developers (apart, of course, from making them more productive on the items that were most important).

So I decided to create a spreadsheet as a temporary solution (where have you heard that before?) where I could keep the important details from the three databases in a single place, thus allowing me to create consolidated reports.

That’s it for this first article. Subsequent articles will cover how this spreadsheet evolved into an application known internally as BC5.

Why upgrade to .NET?

September 23, 2005 in .NET, Delphi

This is a serious question: why should developers switch to using .NET? There are two things to consider here: existing and new applications.

First, existing applications: what gains are there to be had from converting your codebase? What can a .NET version of your product do above what is already being done? Very little, I’d wager. In theory, upgrading will provide all the benefits of .NET but, in practice, this is extremely unlikely if history is any guide.

Although it may not be so obvious at the moment, .NET will provide a far richer development environment than Win32. Signs of this are already showing: LINQ taking generics to the next level, the richer graphics experience shown in the PDC keynote, etc. So you decide that upgrading is worth it to stay cutting edge and make use of the latest developments, but will you be able to make FULL use of them? Some (many?) .NET enhancements require facilities just not available in Win32.

I well remember a similar period during the DOS-to-Windows transition. Anyone still use WordPerfect? A classic example of trying to fit a DOS application into a Windows world. It looks obvious now that it was doomed to failure. It didn’t look obvious then, any more than trying to fit the square Win32 peg into the round .NET hole looks obvious now!

New applications are a different story. I see no point in writing new Win32 applications when .NET is the future. I go further: you need to write in a language specifically designed for .NET, not a hybrid providing upward compatibility with Win32. This is where Borland, IMO, have made a huge mistake with Delphi. Providing an upward path from Win32 is a short-term solution to a long-term non-issue.

Seeing this gap in the market is what caused us to create Chrome. It is already obvious that Chrome sprints while Delphi for .NET walks! One of Delphi’s biggest advantages over the years has been the VCL, but that has turned into a heavy anchor! Converting the library to handle all the new .NET paradigms will inevitably cost many man-hours, and for what? It was the market leader a few years ago, but the tools and components in Visual Studio have caught up (at least) and have the advantage of being multi-language.

In conclusion, I would recommend leaving existing applications where they are and, if the new facilities make the case, rewrite in a .NET language. For new applications, it is simple: write in a .NET language and if you like Pascal, use Chrome!

Note: for a detailed discussion on the benefits and reasons for using Chrome see http://www.chromesville.com?ch04