with a CoCr or titanium-alloy-based implant if the situation demands it. Stainless steel is used only transiently because it neither resists corrosion nor matches the strength of titanium alloys [11]. Alloying stainless steel with other elements does improve its corrosion resistance, but it is still outclassed by many other implant materials.
Titanium alloys have excellent material properties and high corrosion resistance [24-26]. Like CoCr, titanium naturally forms an oxide layer inside the human body that protects it from corrosion [27]. Grade 4 commercially pure titanium (CP-Ti) and Ti-6Al-4V (Ti-64) are the two most commonly used forms of titanium for medical implants [24,28]. Titanium's success as an implant material comes from its superior corrosion resistance, biocompatibility, and high strength-to-weight ratio [23,28-33].
However, titanium suffers from lower wear resistance than CoCr and more often produces wear debris in patients [11]. This debris poses a considerable risk to patients because the vanadium in Ti-6Al-4V is highly cytotoxic [34]. Several substitutions for vanadium have been made to create safer alloys. For example, Ti-6Al-7Nb replaces the cytotoxic vanadium with the much safer niobium, which has the added benefit of increasing corrosion resistance [34]. Ti-5Al-2.5Fe is another alloy created for a similar purpose.
Ti-5Al-2.5Fe improves on Ti-6Al-4V in almost all respects [35]. Surface modification of titanium alloys has also been reported to improve corrosion resistance, such as SiO2 coatings applied to the Ti-6Al-7Nb alloy and to CP-Ti (Grade 4) [36,37]. However, even with all these improvements, Ti alloys still suffer from the problem common to all implants: constant frictional forces generate considerable amounts of metal debris inside the body over long service times [35,38]. New modifications of the alloy composition that improve wear resistance are therefore desirable.
This report aims to present a computational method for optimizing the composition of titanium alloys for increased wear and corrosion resistance. The presented method and alloy optimization results serve a two-fold purpose: 1) To provide a high-throughput method for down-selecting alloys with improved corrosion and wear resistance, which can be utilized to guide the experimental design of orthopedic implants. 2) To deepen our understanding of the effect of different alloying elements on increasing the mechanical compatibility, wear resistance, and corrosion resistance of alloys.
The presented computational approach is used for assessing the stability, mechanical biocompatibility, wear resistance, and corrosion resistance of several titanium alloys. We adopt a two-tier approach: In the first tier, we use fundamental atomic and electronic attributes for a high-throughput selection of alloying elements that increase wear and corrosion resistance. In the second tier, we use the CALPHAD databases and tools integrated within the ThermoCalc software to predict the thermodynamic stability, corrosion resistance, wear resistance, and mechanical compatibility of the selected ternary titanium alloys. We show the improvement in corrosion resistance of the selected alloys based on their reduced Pilling-Bedworth (PB) ratio, defined as the volume of the formed oxide phase(s) divided by the volume of the alloy phase(s) that formed the oxides.
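As a minimal illustration of this quantity, the following Python sketch evaluates the PB ratio for the oxidation of pure titanium to rutile TiO2 using handbook molar masses and densities; in this work, the phase volumes instead come from the CALPHAD equilibrium calculations.

# Pilling-Bedworth ratio for a metal M forming the oxide M_aO_b:
#   PB = (M_oxide * rho_metal) / (a * M_metal * rho_oxide)
# PB < 1: oxide too thin to cover the metal; 1 <= PB <= 2: protective;
# PB > 2: compressive stresses make the oxide crack and spall.

def pilling_bedworth(m_oxide, rho_oxide, m_metal, rho_metal, a):
    """Volume of oxide formed per unit volume of metal consumed."""
    return (m_oxide * rho_metal) / (a * m_metal * rho_oxide)

# Ti -> TiO2 (rutile): M = 79.87 g/mol, rho = 4.23 g/cm^3
# Ti metal:            M = 47.87 g/mol, rho = 4.51 g/cm^3
pb_ti = pilling_bedworth(79.87, 4.23, 47.87, 4.51, 1)
print(f"PB ratio of TiO2 on Ti: {pb_ti:.2f}")  # ~1.78, in the protective range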
Enhanced wear resistance is assessed based on the increased hardness of the selected
alloys compared to the benchmark system of Ti-6Al-4V (mole%).
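Hardness serves as a wear proxy here through Archard's classical wear equation, V = K F s / H, in which the worn volume V is inversely proportional to the hardness H of the softer surface for a given wear coefficient K, normal load F, and sliding distance s. A minimal sketch of the resulting relative-wear estimate follows; the numeric hardness values are illustrative placeholders, not measured data.

def relative_wear(h_candidate, h_benchmark):
    """Archard's equation V = K*F*s/H: for identical K, F, and s, the wear
    volume of a candidate alloy relative to the benchmark is H_bench/H_cand."""
    return h_benchmark / h_candidate

# Illustrative values: a candidate alloy 20% harder than the Ti-6Al-4V
# benchmark is predicted to wear about 17% less under identical conditions.
print(relative_wear(h_candidate=1.2, h_benchmark=1.0))  # 0.833...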
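Putting the two tiers together, a hypothetical rendering of the down-selection loop is sketched below; it is not the code used in this work. The tier-one attribute table, its threshold, and the amount of the ternary addition are illustrative placeholders, and the tier-two query assumes the TC-Python interface to ThermoCalc, with the Ti-alloy database name ("TCTI2") also an assumption.

from tc_python import *

# Tier 1: screen alloying elements on fundamental attributes.
# element -> (known to be cytotoxic, placeholder attribute score in [0, 1])
CANDIDATES = {"Nb": (False, 0.9), "Zr": (False, 0.8),
              "Mo": (False, 0.7), "V": (True, 0.9)}  # V excluded: cytotoxic [34]
selected = [el for el, (toxic, score) in CANDIDATES.items()
            if not toxic and score >= 0.75]

# Tier 2: CALPHAD equilibrium for each surviving ternary Ti-Al-X alloy.
with TCPython() as session:
    for el in selected:
        calc = (
            session
            .select_database_and_elements("TCTI2", ["Ti", "Al", el])
            .get_system()
            .with_single_equilibrium_calculation()
            .set_condition(ThermodynamicQuantity.temperature(), 1073.15)  # K
            .set_condition(
                ThermodynamicQuantity.mole_fraction_of_a_component("Al"), 0.06)
            .set_condition(
                ThermodynamicQuantity.mole_fraction_of_a_component(el), 0.04)
        )
        result = calc.calculate()
        # The stable phases and their amounts feed the PB-ratio and
        # hardness assessments sketched above.
        print(el, result.get_stable_phases())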
In section 2, we elaborate on the methodology used in this work. In section 3, we utilize the methodology to select ternary titanium alloys with improved wear and corrosion resistance. We study the stability, biocompatibility, wear resistance, and corrosion resistance of the selected alloys using the CALPHAD approach. In section 4, we