Abstract:
In the present study, the effect of Fe content on the Hall-Petch coefficient of (CoCrMnNi)100-xFex (x = 20, 40, 50, and 60) medium- and high-entropy alloys (M/HEAs) was systematically investigated. The
cold-rolled alloys were annealed at 900 °C and 1000 °C for durations between 3 min and 10 h for recrystallization. A scanning electron microscope with a backscattered electron detector was used to obtain micrographs of the recrystallized specimens for grain size measurement. Tensile testing was used to evaluate the mechanical properties of the alloys. The microstructure
showed that, regardless of the alloy composition, the grain size was approximately the same under a given heat-treatment condition. Moreover, all the alloys obeyed the classical Hall-Petch relationship. The friction stress (solid-solution, SS, strengthening) decreased with increasing Fe content, which was attributed to weaker lattice distortion caused by the reduced atomic size misfit. The Hall-Petch coefficient, which represents grain boundary (GB) strengthening, also decreased as the Fe content increased. No linear relationship was found between the intrinsic stacking fault energy and the Hall-Petch coefficient. However, it is proposed that the monotonic decrease of the Hall-Petch coefficient is directly related to the unstable stacking fault energy (γUSFE). As a result, an increase of Fe content in the (CoCrMnNi)100-xFex alloy system leads to an increase of γUSFE,
which in turn weakens GB strengthening (i.e., lowers the Hall-Petch coefficient). Moreover, MEAs and HEAs with higher Fe content tend to have lower yield strength owing to weaker contributions from both SS and GB strengthening. Therefore, to design superior MEAs and HEAs with enhanced strength, the choice of principal elements and their respective contents is critical for optimizing the contributions of both SS and GB strengthening mechanisms.
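For reference, the classical Hall-Petch relationship invoked above can be written in its standard form (the symbols below are the conventional ones, not fitted values from this study):

```latex
\sigma_y = \sigma_0 + k_{\mathrm{HP}}\, d^{-1/2}
```

where \(\sigma_y\) is the yield strength, \(\sigma_0\) the friction stress (the SS-strengthening term that decreased with Fe content), \(k_{\mathrm{HP}}\) the Hall-Petch coefficient (the GB-strengthening term), and \(d\) the average grain size.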