A coding comparison query enables you to compare coding done by two users or two groups of users.
It provides two ways of measuring inter-rater reliability, or the degree of agreement between the users: percentage agreement and the Kappa coefficient.
Percentage agreement is the number of units of agreement divided by the total units of measure within the data item, displayed as a percentage.
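For example, if two user groups agree on 6 of the 8 units in a data item, the percentage agreement is 75%. A minimal sketch of this calculation (illustrative only; the variable names are hypothetical, not NVivo's implementation):

```python
# Illustrative sketch only, not NVivo's implementation.
# Each element is True where user groups A and B made the same coding
# decision for one unit of the data item (e.g. one character of text).
decisions_match = [True, True, False, True, True, False, True, True]

units_of_agreement = sum(decisions_match)   # 6
total_units = len(decisions_match)          # 8

percentage_agreement = 100 * units_of_agreement / total_units
print(f"Percentage agreement: {percentage_agreement:.1f}%")  # 75.0%
```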
Kappa coefficient is a statistical measure that takes into account the amount of agreement that could be expected to occur through chance. For more information, see the Wikipedia article on Cohen's kappa.
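Cohen's kappa is defined as κ = (Po − Pe) / (1 − Pe), where Po is the observed agreement and Pe is the agreement expected by chance. A minimal sketch of the two-rater, coded/not-coded case, using hypothetical counts (not NVivo's internal code):

```python
# Hypothetical unit counts for two user groups rating the same content.
both_coded = 40        # A and B both coded the unit
neither_coded = 30     # neither A nor B coded the unit
only_a = 20            # A coded it, B did not
only_b = 10            # B coded it, A did not

total = both_coded + neither_coded + only_a + only_b  # 100

# Observed agreement: proportion of units where the groups agree.
p_o = (both_coded + neither_coded) / total            # 0.70

# Chance agreement: product of each group's marginal coding rates,
# summed over the two categories (coded, not coded).
a_coded = (both_coded + only_a) / total               # 0.60
b_coded = (both_coded + only_b) / total               # 0.50
p_e = a_coded * b_coded + (1 - a_coded) * (1 - b_coded)  # 0.50

kappa = (p_o - p_e) / (1 - p_e)
print(f"kappa = {kappa:.3f}")  # kappa = 0.400
```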
In Navigation View, click on Queries.
On the Main toolbar, click the New button.
Click the Coding Comparison Query in This Folder option.
The Coding Comparison Query dialog box is displayed.
In the Compare coding between field, click Select to choose the users to include in user groups A and B.
In the At field, choose the nodes to compare. Click Select to choose specific nodes.
In the Scope field, choose the sources to query. Click Select to choose specific sources.
By default, the Display Kappa coefficient and Display percentage agreement checkboxes are both selected. To omit either measure from the results, clear the corresponding checkbox.
If you want to save the query, select the Add to Project checkbox at the top of the dialog box, then enter a name and description on the General tab:
| Option | Description |
| --- | --- |
| Query type | Displays the type of query you are creating. You cannot change the contents of this field. |
| Name | Enter a name for the query. |
| Description | If required, enter a description of the query. |
| Location | Displays the folder that contains the query. You cannot change the contents of this field. |
| Created | Displays the date and time the query was created. You cannot change the contents of this field. |
| Modified | Displays the date and time the query was last modified. You cannot change the contents of this field. |
To save the query properties without running the query, click OK.
To run the query, click the Run button.
The results include a Source Length column, which provides the measure of each source as follows (a sketch follows this list):
Documents, Memos and Externals = number of characters
Media file = duration in minutes/seconds/10ths of a second
Picture = the total number of pixels expressed as height multiplied by width
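As a rough illustration, a 640 × 480 pixel image has a source length of 307,200 pixels, while a 9,500-character document has a source length of 9,500. A hypothetical helper mirroring the units listed above (not an NVivo API):

```python
# Hypothetical helper mirroring the units listed above; not an NVivo API.
def source_length(kind: str, **dims) -> float:
    if kind in ("document", "memo", "external"):
        return dims["characters"]               # number of characters
    if kind == "media":
        return dims["seconds"]                  # duration, to 0.1 s
    if kind == "picture":
        return dims["height"] * dims["width"]   # total pixels
    raise ValueError(f"unknown source kind: {kind}")

print(source_length("picture", height=480, width=640))  # 307200
print(source_length("document", characters=9500))       # 9500
```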
If you have opted to display the Kappa coefficient and the percentage agreement, the results include a Kappa column and several columns showing the percentage of agreement and disagreement between user group A and user group B.
If the users are in complete agreement, then κ = 1. If there is no agreement among the raters other than what would be expected by chance, then κ ≤ 0.
The percentage agreement columns indicate the following values (a sketch of the arithmetic follows this list):
Agreement Column = the sum of the columns "A and B" and "Not A and Not B"
A and B = the percentage of data item content coded to the selected node by both Project User Group A and Project User Group B
Not A and Not B = the percentage of data item content coded by neither Project User Group A nor Project User Group B
Disagreement Column = the sum of the columns "A and Not B" and "B and Not A"
A and Not B = the percentage of data item content coded by Project User Group A and not coded by Project User Group B
B and Not A = the percentage of data item content coded by Project User Group B and not coded by Project User Group A
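Because these four columns partition each data item, the Agreement and Disagreement columns always sum to 100%. A small sketch with hypothetical percentages:

```python
# Hypothetical percentages for one row of the results table.
a_and_b = 40.0          # coded by both Project User Groups A and B
not_a_and_not_b = 30.0  # coded by neither group
a_and_not_b = 20.0      # coded by A only
b_and_not_a = 10.0      # coded by B only

agreement = a_and_b + not_a_and_not_b       # 70.0
disagreement = a_and_not_b + b_and_not_a    # 30.0
assert agreement + disagreement == 100.0    # the columns partition the item
```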
You can print or export the results of the coding comparison query, but you cannot save the results within the project. To view the content that has been coded, right-click on a selected row and choose Open Node or Open Source to review the coding in detail.