A queueing model to evaluate the impact of patient “batching” on throughput and flow time in a medical teaching facility

We consider the work flow in a medical teaching facility, examining the process that involves an initial patient exam by a resident physician, a subsequent conference between the resident and the attending physician, and the attending physician's visit with the patient. We create an analytical model of a tandem queue with finite buffer space to analyze the impact of different work prioritization policies on the throughput and the flow time of patients in the facility, measures that influence both the facility's finances and patients' satisfaction. We derive throughput-optimal policies and show that these policies involve dynamic batching. This finding is interesting because our model does not include any setup times, and setup times normally imply batching; rather, it is the uncertain service times and the requirement for simultaneous service in the conference step that make batching optimal. The optimal dynamic batching policy is complex, so we also consider a simpler static batching policy. We show that, in systems with limited buffer space, large batches can sometimes degrade efficiency by simultaneously increasing flow time and decreasing throughput. In general, however, both flow time and throughput increase with batch size. Because flow time increases at a faster rate than throughput, hospital management may want to consider what batch size is optimal given the value it places on the two measures.
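As a rough illustration of the static batching trade-off described in the abstract, the sketch below simulates the three-step flow (resident exam, joint resident-attending conference, attending visit) for one resident and one attending under a fixed batch size. It is not the paper's analytical model: it assumes a saturated waiting room rather than a finite buffer, measures flow time from the start of the resident exam, and the exponential service-time means (exam_mean, conf_mean, visit_mean) are hypothetical parameters chosen only for demonstration.

    import random

    def simulate(batch_size, n_batches=20000, exam_mean=0.5, conf_mean=0.25,
                 visit_mean=0.3, seed=1):
        # Batch-cycle simulation: one resident, one attending, static batches
        # of `batch_size` patients, exponential service times (assumed values).
        rng = random.Random(seed)
        draw = lambda mean: rng.expovariate(1.0 / mean)

        resident_free = 0.0    # time at which the resident finishes current work
        attending_free = 0.0   # time at which the attending finishes current work
        total_flow = 0.0
        patients_seen = 0

        for _ in range(n_batches):
            # Step 1: resident examines the batch, one patient at a time.
            exam_starts = []
            for _ in range(batch_size):
                exam_starts.append(resident_free)
                resident_free += draw(exam_mean)

            # Step 2: the conference needs BOTH physicians simultaneously,
            # so it starts only when the later of the two is free.
            conference_start = max(resident_free, attending_free)
            conference_end = conference_start + draw(conf_mean)
            resident_free = conference_end   # resident may start the next batch
            attending_free = conference_end

            # Step 3: attending visits each patient in the batch, one at a time.
            for start in exam_starts:
                attending_free += draw(visit_mean)
                total_flow += attending_free - start
                patients_seen += 1

        throughput = patients_seen / attending_free   # patients per unit time
        mean_flow_time = total_flow / patients_seen
        return throughput, mean_flow_time

    if __name__ == "__main__":
        # Larger static batches amortize the conference synchronization cost
        # (higher throughput) but make every patient wait for batch-mates
        # (longer flow time), mirroring the trade-off noted in the abstract.
        for k in (1, 2, 4, 8):
            thr, flow = simulate(batch_size=k)
            print(f"batch size {k}: throughput {thr:.2f}, mean flow time {flow:.2f}")

Running this sketch with the assumed parameters shows throughput rising and mean flow time rising faster as the batch size grows, consistent with the general finding stated in the abstract; the finite-buffer effects the paper analyzes are not captured here.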

Bibliographic Details
Main Authors: Dobson, Gregory; Lee, Hsiao-Hui; Sainathan, Arvind; Tilson, Vera
Other Authors: Nanyang Business School
Format: Journal Article
Language: English
Published: 2012
Subjects: DRNTU::Business::Operations management
Online Access:https://hdl.handle.net/10356/98764
http://hdl.handle.net/10220/17654
Institution: Nanyang Technological University
Citation: Dobson, G., Lee, H. H., Sainathan, A., & Tilson, V. (2012). A queueing model to evaluate the impact of patient “batching” on throughput and flow time in a medical teaching facility. Manufacturing and Service Operations Management, 14(4).
Journal: Manufacturing and Service Operations Management
DOI: 10.1287/msom.1120.0380