ChatGPT in medical practice, education and research: malpractice and plagiarism

Editor – We found the recent article on early applications of ChatGPT in medical practice, education and research very interesting.1 According to Sedaghat, authors using ChatGPT professionally for academic work should exercise caution, as it is unclear how ChatGPT handles hazardous content, false information or plagiarism.1 Sedaghat pointed out that while ChatGPT can simplify radiological reporting, there remains a risk of inaccurate statements and missing medical information.1 Sedaghat concluded that while ChatGPT has the potential to alter medical practice, research and education, it still needs refinement before it can be used routinely in medicine.1
It is acknowledged that ChatGPT is a helpful AI tool and that it may one day prove advantageous in research, education and medical practice. But, as Sedaghat pointed out, the key issue is the accuracy of the data produced. Furthermore, the issues of malpractice and plagiarism should be addressed.2 All processes remain the user's responsibility. Using a computational tool is not a negative thing, but using it incorrectly is unethical.2 One potentially unethical use is employing ChatGPT to write an article without the user's input in primary content drafting, validation and final approval.2
We believe it is critical to continue developing the current version of ChatGPT to make it more useful in the future. To prevent any unintended malpractice or misconduct, it is also necessary to review and update the specific code of conduct for using ChatGPT in medical practice, teaching and research.
- © Royal College of Physicians 2023. All rights reserved.
References