Abstract

We validated the North American Spine Society (NASS) outcome-assessment instrument for the lumbar spine in a computerised touch-screen format and assessed patients’ acceptance, taking into account previous computer experience, age and gender.

Fifty consecutive patients with symptomatic and radiologically-proven degenerative disease of the lumbar spine completed both the hard copy (paper) and the computerised versions of the NASS questionnaire. Statistical analysis showed high agreement between the paper and the touch-screen computer format for both subscales (intraclass correlation coefficient 0.94; 95% confidence interval 0.90 to 0.97) independent of computer experience, age and gender. In total, 55% of patients stated that the computer format was easier to use and 66% preferred it to the paper version (p < 0.0001 among subjects expressing a preference). Our data indicate that the touch-screen format is comparable to the paper form. It may improve follow-up in clinical practice and research by meeting patients' preferences and minimising administrative work.

The North American Spine Society (NASS) outcome assessment instrument for the lumbar spine was developed as a disease-specific, comprehensive, self-reporting instrument for standardised assessment in patients with low back pain.1 It consists of 17 questions covering pain, function and neurogenic symptoms. The outcome questions on pain and function are based on the Oswestry disability index.2 The questionnaire has been validated in a German version and belongs to a proposed core set of instruments for research on low back pain.3,4 Normal values are available from the general population for comparison of subpopulations.5 In an attempt to minimise the administrative burden of paper-based formats, computerised versions of several outcome instruments have been developed.6-9 There are advantages in terms of the acquisition of data, analysis, easy storage and retrieval.10,11 We were interested in patients' acceptance of such a method. The touch-screen format tested in our study presents each question visually as a cartoon, in written form, and read aloud by a professional speaker. Only one question is presented at a time in large letters, which may be especially helpful for older patients.

We have compared the paper format of the NASS questionnaire with the new touch-screen computer format in different age groups. We hypothesised that the latter would be as reliable as the paper version and that patient acceptance would be high because of its easy-to-use application and audiovisual presentation.

Patients and Methods

Fifty consecutive patients who were about to undergo surgery because of symptomatic and radiologically-proven degenerative disease of the lumbar spine were invited to complete both the paper format and the computerised touch-screen format of the NASS in a validated German version.3 There were 31 women and 19 men with a mean age of 61.5 years (20 to 88). Of these, 33 had spinal stenosis and 17 had degenerative disc disease. In all patients the assessment was performed on the day of admission to hospital. Details of the patients are given in Table I. Each had to be able to understand the German language.

Table I. Details of the patients by number and percentage when applicable

The patients completed either the paper or the computerised form first, according to a block randomisation. The mean interval between completing both versions was 4.5 hours. Patients were not able to see their previous scores.

Touch-screen format of the NASS.

For the touch-screen computer format, audio and visual cues were presented on the 34.3 cm diagonal screen of a portable personal computer (PC). The questions were presented in a written form as well as read aloud by a speaker. In addition, the topic of each question was represented in a cartoon. Only one question was presented at a time. The questions were answered by touching the square of the corresponding answer on the computer screen. The patient was able to move one question backwards or forwards, and to skip a question. The software was developed by a private programming company (H-Informatik, Aarau, Switzerland) based on Microsoft Windows (Microsoft Corporation, Redmond, Washington). This data-capturing method has been validated for the WOMAC osteoarthritis index.9
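As a rough illustration only (not the vendor's actual software, whose internals are not described in the source), the one-question-at-a-time navigation with forward, back and skip could be modelled as:

```python
# Hypothetical sketch of one-question-at-a-time navigation with
# forward, back and skip, as described for the touch-screen format.
class QuestionnaireNav:
    def __init__(self, n_questions):
        self.n = n_questions
        self.idx = 0                       # current question index
        self.answers = [None] * n_questions

    def answer(self, value):
        """Record the answer to the current question, then advance."""
        self.answers[self.idx] = value
        self.forward()

    def forward(self):
        """Move one question forwards; also used to skip a question."""
        self.idx = min(self.idx + 1, self.n - 1)

    def back(self):
        """Move one question backwards."""
        self.idx = max(self.idx - 1, 0)


nav = QuestionnaireNav(3)
nav.answer(2)    # answer question 1
nav.forward()    # skip question 2 (left as None)
nav.answer(5)    # answer question 3
```

Skipped questions remain `None` in the answer list, which is how missing items arise in the electronic format.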

Assessment of patients’ preferences.

After completing both formats the patients were asked to complete a short multiple-choice supplementary questionnaire (four items) on their previous computer use and preferences in regard to both formats. The following questions were asked:

  1. Which format was easier to use? – paper/PC/ undecided;

  2. Which format do you prefer? – paper/PC/undecided;

  3. Do you have experience with computers? – no/in private life/professionally; and

  4. Do you have experience with the internet? – yes/no.

Statistical analysis.

Random assignment of which format (paper or computer) was applied first was performed in blocks of four. Descriptive statistics included the mean of the aggregated scores, the standard deviation (SD) and the mean difference between the scores of the paper and the computerised format. A paired t-test was used to analyse the difference between formats. Agreement (reliability) was assessed by the intraclass correlation coefficient (ICC). Subgroup analysis was performed according to gender and age (< 65 years or ≥ 65 years). Ease of use, and the preference of subjects who expressed one, were compared by a two-tailed test. The data were analysed using SAS version 8.1 (SAS Institute, Cary, North Carolina) and SPSS version 11 (SPSS Inc, Chicago, Illinois); p values < 0.05 were regarded as significant.
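For readers unfamiliar with the agreement statistic used, the two-way random-effects, absolute-agreement, single-measures ICC (often written ICC(2,1)) for paired paper and computer scores can be computed from the ANOVA decomposition. The sketch below uses invented example scores; it is not the authors' analysis code, which was run in SAS and SPSS.

```python
# Minimal sketch of ICC(2,1) (two-way random effects, absolute
# agreement, single measures) for n subjects measured by k = 2
# "raters" (paper and computer formats). Example data are invented.
from statistics import mean

def icc_2_1(x, y):
    n, k = len(x), 2
    subj_means = [(a + b) / 2 for a, b in zip(x, y)]
    rater_means = [mean(x), mean(y)]
    grand = mean(x + y)
    # ANOVA sums of squares
    ss_rows = k * sum((m - grand) ** 2 for m in subj_means)   # subjects
    ss_cols = n * sum((m - grand) ** 2 for m in rater_means)  # formats
    ss_total = sum((v - grand) ** 2 for v in x + y)
    ss_err = ss_total - ss_rows - ss_cols                     # residual
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical paired subscale scores (0 to 100) for illustration:
paper    = [40, 55, 62, 30, 71, 48, 66, 52]
computer = [42, 54, 60, 33, 70, 50, 65, 51]
print(round(icc_2_1(paper, computer), 3))
```

An ICC near 1 indicates that the two formats rank subjects almost identically and show little systematic shift between them.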

Results

The two original subscales were created and compared for both formats. These were pain/disability in the last week and neurogenic symptoms in the last week. All the aggregated subscale scores were transformed onto a 0 to 100 scale, where 0 corresponded to maximum symptoms/disability and 100 to minimum or no symptoms/disability. Two patients with three missing items on the pain/disability subscale were excluded from that subscale. Similarly, one patient with three missing questions on the neurogenic subscale was excluded from that subscale. Daltroy et al1 recommended the exclusion of patients with missing items, namely two or more on the neurogenic and three or more on the pain/disability subscale.
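The scoring and exclusion rules above can be sketched as follows. The item response range used here (1 to 6) is a placeholder, as the actual NASS item scales are not reproduced in this article; only the 0-to-100 transformation and the missing-item thresholds come from the text.

```python
# Sketch of the subscale scoring described above. The item range
# (item_min, item_max) is hypothetical; the 0-100 direction and the
# missing-item exclusion thresholds follow the text (Daltroy et al).
def subscale_score(items, max_missing, item_min=1, item_max=6):
    """Average the answered items and rescale to 0-100, where 0 is
    maximum symptoms/disability and 100 is no symptoms. Returns None
    when the number of missing items reaches the exclusion threshold."""
    answered = [v for v in items if v is not None]
    if len(items) - len(answered) >= max_missing:
        return None  # patient excluded from this subscale
    mean_raw = sum(answered) / len(answered)
    # Linear transform: item_max (worst) -> 0, item_min (best) -> 100
    return 100.0 * (item_max - mean_raw) / (item_max - item_min)

# Pain/disability subscale: excluded at three or more missing items
print(subscale_score([1, 1, 2, None, 1, 2, 1], max_missing=3))
print(subscale_score([1, None, None, None, 1, 2, 1], max_missing=3))  # None
```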

The means and SD of the scores for the paper and computerised formats and the mean differences between formats are shown in Table II. There was no consistent variation between the electronic and paper versions, although a small but significant difference was found for the pain/disability subscale (p < 0.01). The intraclass correlation coefficients (ICC) were very high at 0.94 (95% confidence interval (CI) 0.90 to 0.97), confirming very good agreement between the formats for both subscales.

Table II. Comparison of paper and computer touch-screen versions of the NASS instrument

When agreement between the formats was stratified by previous computer use, the ICCs between the paper and computer formats were similar for patients with and without computer experience. The ICCs for those with previous computer experience were as follows: pain/disability subscale 0.95 (95% CI 0.88 to 0.98) and neurogenic subscale 0.97 (95% CI 0.93 to 0.99). Those for patients without previous computer experience were as follows: pain/disability subscale 0.93 (95% CI 0.83 to 0.97) and neurogenic subscale 0.93 (95% CI 0.81 to 0.97).

Of the 50 patients, 26 (52%) were aged 65 years or older. When agreement between the formats was stratified by age, the ICCs between the paper and the computerised formats were similar for subjects younger than 65 years and those aged 65 years or older. The ICCs for subjects younger than 65 years were as follows: pain/disability subscale 0.96 (95% CI 0.92 to 0.98) and neurogenic subscale 0.97 (95% CI 0.93 to 0.99). Those for subjects aged 65 years or older were as follows: pain/disability subscale 0.91 (95% CI 0.78 to 0.96) and neurogenic subscale 0.90 (95% CI 0.78 to 0.96).

Of our patients, 19 (38%) were men. When agreement between the formats was stratified by gender, the ICCs between the paper and the computerised formats were similar for men and women. The ICCs for men were as follows: pain/disability subscale 0.96 (95% CI 0.86 to 0.97) and neurogenic subscale 0.97 (95% CI 0.91 to 0.99). Those for women were as follows: pain/disability subscale 0.94 (95% CI 0.87 to 0.97) and neurogenic subscale 0.93 (95% CI 0.87 to 0.97).

The absolute scores for both subscales obtained from our patients were significantly lower than the normal values from a general population (Table II).5 This supports the external validity of the assessment instrument showing that subjects with back problems have lower scores than healthy control subjects.

Figure 1 gives the results of the exit interview. Although 45% of the patients had no experience with computers at all and 66% had no experience with the internet, 55% stated that the computerised format was easier to use, only 2% voted for the paper format and 43% were undecided. Overall, 66% of subjects preferred the computerised format, while 8% preferred the paper format and 26% were undecided. Among subjects who expressed a preference, the computerised format was favoured significantly more often than the paper format, both for ease of use and for overall preference (p < 0.0001).

Fig. 1. Details of the exit interview in regard to the questions a) which format was easier to use? and b) which format do you prefer? (n = 47; * p < 0.0001 for subjects who stated a preference).

Discussion

The NASS lumbar outcome assessment instrument is a valid and reliable tool for measuring the outcome in patients with low back pain. Our aim was to assess the validity of a computerised format as well as its ease of use and acceptability. Our results show that a computerised touch-screen format is as reliable as the original paper format. The high ICC of 0.94 for both subscales indicates that the electronic method did not introduce any systematic variation. The magnitudes of the variation in our study (ICCs ranging from 0.78 to 0.97) are comparable with those found in the original paper validation for test-retest reliability (ICCs of 0.85 to 0.97).1

There is an increasing number of reports on computerised outcome instruments7-13 and general agreement on the several advantages of the electronic format, such as a reduced amount of paper, a reduced number of missing items and easier handling, analysis and storage of data. Saleh et al12 compared the Western Ontario and McMaster Universities osteoarthritis index and SF-36 outcome measures using palm-top computers and paper surveys in 96 patients. According to their results, the two methods were comparable, but there was some concern that the more poorly-educated and older patients would not be able to complete a palm-based questionnaire. Our results show no effect of age on the ICCs, which were similar in those aged below and above 65 years. One possible explanation for the difference between our experience and that of Saleh et al12 may be the audiovisual approach of our computerised questionnaire, which may improve the understanding of each question. In addition, each question is presented separately on the computer screen, with simple 'go forward' and 'go back' commands. With our approach it did not seem to matter that almost half of our study population had no earlier experience with computers and that two-thirds were unfamiliar with the internet, as stated in the supplementary questionnaire.

Despite the lack of previous computer experience in a considerable number of individuals, most still preferred the computerised format. This supports the high degree of acceptability of electronic questionnaires, independent of age, gender and familiarity with computer technology.7,8,14

There were missing items on the electronic version because the software allows questions to be skipped. This respects a patient's right to leave a question unanswered, but also creates the risk of accidentally omitting a question. The solution, as proposed by Buxton et al,10 may be to adapt the software so that skipped items are presented later a second time. With this method they were able to improve their response rate to 100% for a computerised format of the Quality of Life Questionnaire-Core30 of the European Organisation for Research and Treatment of Cancer.
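The re-presentation strategy proposed by Buxton et al could be sketched as follows. This is an illustrative outline under our own assumptions, not their published implementation; the `ask` callback stands in for whatever interface collects an answer (or `None` for a skip).

```python
# Sketch of re-presenting skipped items a second time, as proposed by
# Buxton et al. `ask(q)` is a hypothetical callback returning the
# patient's answer, or None when the question is skipped.
def administer(questions, ask):
    # First pass: ask every question once.
    answers = {q: ask(q) for q in questions}
    # Second pass: re-present only the skipped questions.
    for q in [q for q, a in answers.items() if a is None]:
        answers[q] = ask(q)
    return answers


# Scripted example: the patient skips Q2 on the first pass and
# answers it on the second presentation.
responses = iter([3, None, 2, 4])
result = administer(["Q1", "Q2", "Q3"], lambda q: next(responses))
```

A question skipped twice would still end up `None`, preserving the patient's right to leave it unanswered while removing accidental omissions.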

Recently, the necessity of a disease-specific outcome instrument has been questioned because of its administrative burden.15 Any instrument can be implemented on a computer system, so that the administrative burden is not substantially changed. In addition, the computerised system allows a real-time, immediate display of results. The details for an individual patient with multiple follow-ups may be seen at once and compared with a normal age-matched control group. This is also possible with the traditional paper format but requires additional equipment and personnel.

Finally, the electronic assessment may simplify quality assurance. A recent Californian survey showed that in more than 80% of surgical patients no outcome is measured, and highlighted the importance of systematic research.16 The authors concluded that no objective data for consumers and health-care sponsors are available to allow for the quality control of different interventions and different institutions. In a healthcare system in which the competition between different providers is becoming more intense, surgeons are well advised to collect hard data on the efficacy of their treatment. Resources will only be given when both quality and efficacy have been demonstrated. Computerised questionnaires reduce the administrative burden and can therefore help to gather data more easily.

Footnotes

  • The authors would like to thank The Foundation to Support Education and Research of the Voluntary Academic Society Basel, the Swiss National Foundation and the Novartis Foundation for Medico-Biological Research for their help in supporting this study.

  • No other benefits in any form have been received or will be received from a commercial party related directly or indirectly to the subject of this article.

  • Received April 7, 2004.
  • Accepted July 13, 2004.

References
