View Issue Details

ID: 0003226
Project: 1 - Backlog
Category: Bug
View Status: public
Last Update: 2019-06-16 17:37
Reporter: WA9PIE
Assigned To: WA9PIE
Status: assigned
Resolution: open
Summary: 0003226: Awards definition builder butchers the output and changes things that weren't intended to change
Description: There are a number of new awards I need to build or revise. It's impossible to do this with confidence because the awards definition builder changes things that you didn't intend to change.

Essentially, the awards definition builder is a UI that enables the user to create or modify an XML document used for awards reporting.

I am very proficient with building and managing the awards, but the problem described here is getting worse. I've been doing my edits in XML Notepad, then importing and exporting them to get the final version done. But the changes I need to make are becoming complex enough that it's hard to do them by hand like that.
Steps To Reproduce:
- Open Logbook (probably with a reasonable number of QSOs)
- Open the Awards Tracking tab
- Go to the ARRL DXCC awards program
- Take a screenshot of the totals shown for the award
- Click on the Definitions button
- Find the ARRL DXCC (HRD) award; copy it by clicking the "Copy" button
- Scroll down and find the copy
- Select it and click Modify
- In the center section, scroll down to "Satellite" and double-click it
- In the resulting dialog box, replace the item in the "Field 1" section with the following:
>> Propagation Mode... Contains... Satellit (yes, I intentionally left off the "e" because that's how it gets saved in the database)
>> At the top where it says "Award to Track", change that to DXCC (Satellite)
- Click Ok... through the resulting dialog boxes to save the changes and get back to the Awards Tracking tab
- Find the newly created DXCC award and compare the totals in the "Worked" column.

They will not match the correct totals in the screenshot. On my machine, the Digital award changes: it goes from excluding non-digital modes to including only non-digital modes (exactly the opposite type of match).
Tags: No tags attached.
Testing: Not Started



2019-03-12 19:12

administrator   ~0007678

There are many moving pieces here, so I think it's important to try to simplify so we have a shot at making a good diagnosis and therefore a good fix.

The instructions given with this issue suggest running a stock award definition, observing the results (the needed contact counts); then copying that definition, modifying it, and observing the resulting counts of the modified definition. By comparing them, we see unexpected changes.

It's possible that the results differ because of the intended change in the definition (that is, by design); because the definition inadvertently changed; because the queries for the definition changed unexpectedly; or because the persistence of the award changed the award definition unexpectedly.

To narrow this down, I've performed a slightly different test:

1) Fire up the logbook; load a populated database
2) Use the "Award Tracking" button in the database view toolbar to open an Award Tracking tab
3) Use the "Definitions" button on the Award Tracking tab to open the definition dialog
4) Select the "ARRL DXCC (HRD)" award
5) Press the "Copy" button to create a copy. Make a note of the new name (this is difficult; the copy is created and added to the list without any prompt or indication of where the new item landed)
6) Select the "ARRL DXCC (HRD)" award again. Use the "Copy" button to make another copy. Again, note the new name.

7) Select the new definition from Step #5.
8) Press the "Export" button to export the definition.
9) Double click the Step #6 definition to edit it.
10) Edit it with the steps given above:

10a) In the center section, scroll down to "Satellite" and double-click it
10b) In the resulting dialog box, replace the item in the "Field 1" section with the following:
10c) Propagation Mode... Contains... Satellit (yes, I intentionally left off the "e" because that's how it gets saved in the database)
10d) At the top where it says "Award to Track", change that to DXCC (Satellite)
10e) Click Ok to return to the "Award 2 Definitions" dialog

11) Make sure the Step #6 definition is selected in the list
12) Press the "Export" button to export the definition to a new file.

At this point, we have a copy of the original definition and the modified definition. We can compare them:

13) Close the editor dialog, return to the Award tracking tab.
14) The original "ARRL DXCC (HRD)" award is selected. For me and the database I'm using, the "Digital" row is "126,66,60,214,32,0,34".
15) Select the modified award, saved at step 10e)
16) Check out the digital row. For me, "336, 323, 13, 4, 33, 0, 290".
17) Select the unmodified copy, from Step #7.
18) Check the digital row again. For me: "126,66,60,214,32,0,34".

The unmodified copy matches the original stock definition. The modified definition matches neither.

We've exported both the unmodified copy and the modified copy, so let's compare them.

The attached "Mantis3326Comparison" spreadsheet highlights the differences in the unpacked XML fields. Some differences are expected, others don't seem to be. In particular, the "Matches_count" and "Matches_value_n" fields are quite different, and I don't think we expect them to be.

I'll drill into the code to figure out what these fields mean ...

Mantis3326Compare.xlsx (13,025 bytes)


2019-03-13 13:51

administrator   ~0007680

I've posted my own log here for testing purposes - "Google Drive\Team Drives\HRD Software\Logs (customers)\wa9pie"


2019-03-13 13:56

administrator   ~0007681

I can probably explain what the fields mean to a degree. We can discuss as needed. I'm pretty familiar with the layout of this XML and the definitions because I end up editing them all the time.

I'm attaching a file that contains the match enumerations. One would hope that this is also in the code.

Your experience above demonstrates the problem exactly.

AwardMatches.txt (464 bytes)
enum eMatchTypes
{
    // ... (preview truncated)

    // Added for numeric.

    MATCH_GT = 202,
    MATCH_LT = 203,

    // Dates.

    // ... (preview truncated)
};


2019-03-13 20:12

administrator   ~0007684

Indeed, the numbers are for the "match" fields. What I need to figure out is why the match lists get compressed. It seems like there should be one match type per predicate specified in the award definition. That seems to be true in the original definition. After the edit and save, though, there are fewer match types than match field definitions. Why is that?

There is code that actively reduces the match type list. That code seems very strange. It starts with an array of arrays, a two-dimensional array of match types. Why is that necessary? For each field, I need one match type, which sounds like a one-dimensional array.

My experiment shows that either the editing code or the serialization code is at fault here.

But I quickly observe that we're in the same state that we are with so many other parts of the product: there's no documentation for the desired or intended behaviour. The code has a mysterious shape; maybe it's necessary, maybe it's not. But since there's no documentation, I've got no source of truth for what should be happening. I don't know what an award definition is fundamentally meant to look like, or what intention it encapsulates (or how), so I don't know what to fix; I don't know what changes I can make without making matters worse.

Until that shortcoming is remedied, I don't know how to make progress on this issue.


2019-03-24 23:49

administrator   ~0007733

Are you referring to the intended behavior of the awards definition builder? If so, I can probably describe the components of the awards builder UI. I can probably describe the parts of the awards XML. But I have no idea what was in the mind of the developer who coded this.

It could be that the awards function is sloppy and needs to be completely replaced. I don't know. But it does work... when the awards definition doesn't get butchered by the awards def builder.


2019-05-30 16:51

administrator   ~0007959

Switched to "enhancement" per the team call on 2019-05-30.


2019-06-15 12:45

administrator   ~0008079

I don't think it matters what was in the mind of the other developers. What matters -- and what's necessary to make a fix -- is a description of how this code is meant to behave when it is working correctly. Maybe it's clearer if I ask: what is in the mind of the user when they work with this feature? Or: what was in the mind of the designer when they designed this feature?

Issue History

Date Modified Username Field Change
2019-03-05 00:46 WA9PIE New Issue
2019-03-05 00:46 WA9PIE Status new => assigned
2019-03-05 00:46 WA9PIE Assigned To => K7ZCZ
2019-03-12 19:12 K7ZCZ File Added: Mantis3326Compare.xlsx
2019-03-12 19:12 K7ZCZ Note Added: 0007678
2019-03-13 13:51 WA9PIE Note Added: 0007680
2019-03-13 13:56 WA9PIE File Added: AwardMatches.txt
2019-03-13 13:56 WA9PIE Note Added: 0007681
2019-03-13 20:12 K7ZCZ Note Added: 0007684
2019-03-13 20:12 K7ZCZ Assigned To K7ZCZ => WA9PIE
2019-03-24 23:49 WA9PIE Note Added: 0007733
2019-03-24 23:49 WA9PIE Assigned To WA9PIE => K7ZCZ
2019-05-30 16:51 K7ZCZ Note Added: 0007959
2019-06-15 12:45 K7ZCZ Note Added: 0008079
2019-06-15 12:45 K7ZCZ Assigned To K7ZCZ => WA9PIE
2019-06-16 17:37 WA9PIE Project 3 - Current Dev List => 1 - Backlog