The longitudinal arch (LA) helps stiffen the foot during walking, but many people in developed countries suffer from flat foot, a condition characterized by reduced LA stiffness that can impair gait. Studies have found this condition is rare in people who are habitually barefoot or wear minimal shoes compared to people who wear conventional modern shoes, but the basis for this difference remains unknown. Here we test the hypothesis that the use of shoes with features that restrict foot motion (e.g. arch supports, toe boxes) is associated with weaker foot muscles and reduced foot stiffness. We collected data from minimally-shod men from northwestern Mexico and men from urban/suburban areas in the United States who wear 'conventional' shoes. We measured dynamic LA stiffness during walking using kinematic and kinetic data, and the cross-sectional areas of three intrinsic foot muscles using ultrasound. Compared to conventionally-shod individuals, minimally-shod individuals had higher and stiffer LAs, and larger abductor hallucis and abductor digiti minimi muscles. Additionally, abductor hallucis size was positively associated with LA stiffness during walking. Our results suggest that use of conventional modern shoes is associated with weaker intrinsic foot muscles that may predispose individuals to reduced foot stiffness and potentially flat foot. As bipeds, humans have evolved dramatically different feet from other primates 1. One of the most distinctive features of the human foot is the longitudinal arch (LA), whose anatomical scaffold is created by the conformation of the tarsal and metatarsal bones, and which is reinforced by numerous soft tissue structures that span the plantar surface of the foot. The LA stiffens the foot under loading, enabling it to function as a propulsive lever during walking and running 2.
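Dynamic LA stiffness of the sort described above is commonly estimated from kinematic and kinetic data as the slope of a joint load measure against arch deformation during the loading phase of stance. The sketch below illustrates that general idea only; the synthetic angle and moment arrays, the linear model, and the units are all assumptions, not the study's actual methods or data.

```python
import numpy as np

# Synthetic stand-in data: arch deformation (degrees of LA compression)
# and a roughly linear midfoot sagittal-plane moment response with noise.
arch_angle_deg = np.linspace(0.0, 6.0, 50)
midfoot_moment = 4.5 * arch_angle_deg + \
    np.random.default_rng(1).normal(0.0, 0.3, 50)   # N*m, hypothetical

# Dynamic stiffness as the least-squares slope of moment vs. deformation
# (N*m per degree of arch compression).
stiffness, intercept = np.polyfit(arch_angle_deg, midfoot_moment, 1)
```

A stiffer arch corresponds to a steeper slope: more moment is required per degree of LA compression.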
LA stiffness partly derives from ligamentous structures, including the long and short plantar ligaments, the spring ligament and the plantar aponeurosis, that traverse the plantar surface of the foot longitudinally and act as trusses to resist compressive forces on the LA 3. The intrinsic foot muscles also contribute to LA stiffness by contracting to help control LA deformation during walking and running 4,5, thereby relieving an unknown proportion of the stress borne by the plantar ligaments. The standing height of the LA on the medial side of the foot is the most commonly used indicator of relative arch height 6. Individuals with exceptionally low LAs while standing are characterized as having flat foot (pes planus). All humans are born with a low arch, and most develop a fully adult configuration of the LA by 10–12 years of life 7. However, roughly 20–25% of adults in the United States and Canada are diagnosed as having flat feet 8–11, either because they fail to develop a normal height arch or because the arch collapses. Most individuals diagnosed with flat foot possess a so-called 'flexible' flat foot, characterized by substantial eversion of the rear foot during weight-bearing, resulting in a marked drop in LA height 12, and reduced LA stiffness during walking 13,14. Although this condition is often asymptomatic 12, in some individuals it causes foot pain and fatigue after long durations of standing and/or walking 15. Reduced LA stiffness is also a risk factor for numerous lower extremity musculoskeletal disorders including plantar fasciitis, knee osteoarthritis, tibialis posterior tendinopathy, and metatarsal stress fracture 11,16–19. Thus, developing strategies to prevent and treat this condition is an important health objective. Despite the high incidence of flat feet in the US and other developed nations, many studies report lower rates of flat foot in habitually barefoot or minimally-shod populations 20–28.
Despite substantial recent interest in walking barefoot and in minimal footwear, little is known about potential differences in walking biomechanics when unshod versus minimally shod. To test the hypothesis that heel impact forces are similar during barefoot and minimally shod walking, we analysed ground reaction forces recorded in both conditions with a pedography platform among indigenous subsistence farmers, the Tarahumara of Mexico, who habitually wear minimal sandals, as well as among urban Americans wearing commercially available minimal sandals. Among both the Tarahumara (n = 35) and Americans (n = 30), impact peaks generated in sandals had significantly (p < 0.05) higher force magnitudes, slower loading rates and larger vertical impulses than during barefoot walking. These kinetic differences were partly due to individuals' significantly greater effective mass when walking in sandals. Our results indicate that, in general, people tread more lightly when walking barefoot than in minimal footwear. Further research is needed to test if the variations in impact peaks generated by walking barefoot or in minimal shoes have consequences for musculoskeletal health.
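The kinetic quantities compared above (impact peak magnitude, loading rate, vertical impulse) can be extracted from a vertical ground reaction force trace in a few lines. The sketch below uses a synthetic single-stance force curve and common gait-analysis definitions; the waveform, sampling rate, body weight, and the simple peak/time-to-peak loading rate are illustrative assumptions, not the paper's pedography pipeline.

```python
import numpy as np

fs = 1000.0                       # assumed sampling rate, Hz
t = np.arange(0.0, 0.6, 1 / fs)   # ~0.6 s of stance
bw = 700.0                        # assumed body weight, N
# Double-humped walking vGRF plus a small impact transient near 50 ms
f = bw * (1.1 * np.sin(np.pi * t / 0.6) ** 2
          + 0.15 * np.exp(-(((t - 0.05) / 0.015) ** 2)))

window = int(0.08 * fs)                        # search first 80 ms for the transient
impact_idx = int(np.argmax(f[:window]))
impact_peak = float(f[impact_idx])             # N
loading_rate = impact_peak / t[impact_idx]     # N/s, peak divided by time-to-peak
impulse = float(np.sum((f[1:] + f[:-1]) / 2) / fs)   # N*s, trapezoidal area
```

On this synthetic trace the impact transient lands near 50 ms, and the vertical impulse is simply the area under the force-time curve, which is what makes "treading more lightly" measurable as a smaller impulse.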
Knee osteoarthritis (OA) is believed to be highly prevalent today because of recent increases in life expectancy and body mass index (BMI), but this assumption has not been tested using long-term historical or evolutionary data. We analyzed long-term trends in knee OA prevalence in the United States using cadaver-derived skeletons of people aged ≥50 y whose BMI at death was documented and who lived during the early industrial era (1800s to early 1900s; n = 1,581) and the modern postindustrial era (late 1900s to early 2000s; n = 819). Knee OA among individuals estimated to be ≥50 y old was also assessed in archeologically derived skeletons of prehistoric hunter-gatherers and early farmers (6000–300 B.P.; n = 176). OA was diagnosed based on the presence of eburnation (polish from bone-on-bone contact). Overall, knee OA prevalence was found to be 16% among the postindustrial sample but only 6% and 8% among the early industrial and prehistoric samples, respectively. After controlling for age, BMI, and other variables, knee OA prevalence was 2.1-fold higher (95% confidence interval, 1.5–3.1) in the postindustrial sample than in the early industrial sample. Our results indicate that increases in longevity and BMI are insufficient to explain the approximate doubling of knee OA prevalence that has occurred in the United States since the mid-20th century. Knee OA is thus more preventable than is commonly assumed, but prevention will require research on additional independent risk factors that either arose or have become amplified in the postindustrial era.

arthritis | aging | obesity | mismatch disease | evolutionary medicine

Osteoarthritis (OA) is the most prevalent joint disease and a leading source of chronic pain and disability in the United States (1) and other developed nations (2).
Knee OA accounts for more than 80% of the disease's total burden (2) and affects at least 19% of American adults aged 45 y and older (3). Substantial evidence indicates that knee OA is proximately caused by the breakdown of joint tissues from mechanical loading (4) and inflammation (5), but the deeper underlying causes of knee OA's high prevalence remain unclear and poorly tested, hindering efforts to prevent and treat the disease. Two recent public health trends, however, are commonly assumed to be dominant factors (6, 7). First, because knee OA's prevalence increases with age (8), the rise in life expectancy in the United States since the early 20th century is thought to have led to high knee OA levels among the elderly, with the presumption that, as people age, their senescing joint tissues accumulate more wear and tear from loading (9). Second, high body mass index (BMI) has become epidemic in the United States in recent decades and is a well-known risk factor for knee OA (8), probably because of the combined effects of joint overloading and adiposity-induced inflammation (10). Whether increases in longevity and BMI are responsible for current knee OA levels has never been tested, but this assumption has led many to view the disease's high prevalence as effectively unpreventable, since aging is untreatable, and the high BMI epidemic is intractable (8, 11). One underused yet potentially powerful way to identify and assess the risk factors responsible for current knee OA levels is to examine long-term changes in the disease's prevalence by comparing contemporary with historic and prehistoric populations (12). Epidemiological studies of present day populations are valuable but are limited in their ability to analyze risk factors that are now pervasive but used to be less common. It is difficult to find large samples of living Americans whose lifestyles, including physical activity levels and diet, resemble those of past generations. 
Although many variables cannot be measured and thus controlled in epidemiological studies of people living in the past, a major benefit of analyzing populations over historical and evolutionary time is to assess known risk factors under different environmental conditions and thus bring to light the effects of risk factors that might not be apparent or testable in modern populations alone. Furthermore, although knee OA is known to be ancient (12), we know very little about changes in its prevalence over time. Low levels of knee OA have been reported for some historic and prehistoric populations (13–17), suggesting that the disease's prevalence has recently increased, but these studies used different diagnostic criteria than those used to diagnose knee OA in living patients, used samples composed mostly of younger individuals, and did not account for BMI, complicating comparisons with modern epidemiological data. Here, we investigate long-term trends in knee OA prevalence in the United States and evaluate the effects of longevity and BMI.

Significance: Knee osteoarthritis is a highly prevalent, disabling joint disease with causes that remain poorly understood but are commonly attributed to aging and obesity. To gain insight into the etiology of knee osteoarthritis, this study traces long-term trends in the disease in the United States using large skeletal samples spanning from prehistoric times to the present. We show that knee osteoarthritis long existed at low frequencies, but since the mid-20th century, the disease has doubled in prevalence. Our analyses contradict the view that the recent surge in knee osteoarthritis occurred simply because people live longer and are more commonly obese. Instead, our results highlight the need to study additional, likely preventable risk factors that have become ubiquitous within the last half-century.
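The crude prevalence contrast in the knee OA abstract above can be checked on the back of an envelope. In this sketch the case counts are reconstructed from the stated prevalences and sample sizes (so they are approximations), and note that the paper's 2.1-fold figure is an age- and BMI-adjusted estimate, so the unadjusted ratio computed here is expected to differ from it.

```python
import math

# Reconstructed counts: ~16% of 819 postindustrial skeletons with eburnation,
# ~6% of 1,581 early industrial skeletons (illustrative, not the raw data).
cases_post, n_post = 131, 819
cases_early, n_early = 95, 1581

p_post = cases_post / n_post
p_early = cases_early / n_early
risk_ratio = p_post / p_early          # crude (unadjusted) prevalence ratio

# 95% Wald confidence interval on the log risk ratio
se = math.sqrt((1 - p_post) / cases_post + (1 - p_early) / cases_early)
lo = math.exp(math.log(risk_ratio) - 1.96 * se)
hi = math.exp(math.log(risk_ratio) + 1.96 * se)
```

The crude ratio comes out near 2.7, larger than the adjusted 2.1-fold estimate, which is consistent with age and BMI explaining part, but not all, of the difference between eras.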
Studies of ancient human skeletal remains frequently proceed from the assumption that individuals with robust limb bones and/or rugose, hypertrophic entheses can be inferred to have been highly physically active during life. Here, we experimentally test this assumption by measuring the effects of exercise on limb bone structure and entheseal morphology in turkeys. Growing females were either treated with a treadmill-running regimen for 10 weeks or served as controls. After the experiment, femoral cortical and trabecular bone structure were quantified with μCT in the mid-diaphysis and distal epiphysis, respectively, and entheseal morphology was quantified in the lateral epicondyle. The results indicate that elevated levels of physical activity affect limb bone structure but not entheseal morphology. Specifically, animals subjected to exercise displayed enhanced diaphyseal and trabecular bone architecture relative to controls, but no significant difference was detected between experimental groups in entheseal surface topography. These findings suggest that diaphyseal and trabecular structure are more reliable proxies than entheseal morphology for inferring ancient human physical activity levels from skeletal remains.
Anthropologists accept that mobility is a critical dimension of human culture, one that links economy, technology, and social relations. Less often acknowledged is that mobility depends on complex and dynamic interactions between multiple levels of our biological organization, including anatomy, physiology, neurobiology, and genetics. Here, we describe a novel experimental approach to examining the biological foundations of mobility, using mice from a long-term artificial selection experiment for high levels of voluntary exercise on wheels. In this experiment, mice from selectively bred lines have evolved to run roughly three times as far per day as those from nonselected control lines. We consider three insights gleaned from this experiment as foundational principles for the study of mobility from the perspective of biological evolution. First, an evolutionary change in mobility will necessarily be associated with alterations in biological traits both directly and indirectly connected to mobility. Second, changing mobility will result in trade-offs and constraints among some of the affected traits. Third, multiple solutions exist to altering mobility, so that various combinations of adjustments to traits linked with mobility can achieve the same overall behavioral outcome. We suggest that anthropological knowledge of variation in human mobility might be improved by greater research attention to its biological dimensions.
This study investigates the influence of genetic differentiation in determining worldwide heterogeneity in osteoporosis-related hip fracture rates. The results indicate that global variation in fracture incidence exceeds that expected on the basis of random genetic variance.

Introduction: Worldwide, the incidence of osteoporotic hip fractures varies considerably. This variability is believed to relate mainly to non-genetic factors. It is conceivable, however, that genetic susceptibility indeed differs across populations. Here, we present the first quantitative assessment of the effects of genetic differentiation on global variability in hip fracture rates.

Methods: We investigate the observed variance in publicly reported age-standardized rates of hip fracture among 28 populations from around the world relative to the expected variance given the phylogenetic relatedness of these populations. The extent to which these variances are similar constitutes a "phylogenetic signal," which was measured using the K statistic. Population genetic divergence was calculated using a robust array of genome-wide single nucleotide polymorphisms.

Results: While K values greater than 1 indicate strong phylogenetic signal, a K value of only 0.103 was detected in the combined-sex fracture rate pattern across the 28 populations, indicating that fracture rates vary more than expected based on phylogenetic relationships. When fracture rates for the sexes were analyzed separately, the degree of phylogenetic signal was also found to be small (females: K = 0.102; males: K = 0.081).

Conclusions: The lack of a strong phylogenetic signal underscores the importance of factors other than stochastic genetic diversity in shaping worldwide heterogeneity in hip fracture incidence.
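The K statistic referred to above is Blomberg's K: the ratio of the mean squared error of the trait values around the phylogenetic mean (ignoring the tree) to the generalized-least-squares mean squared error under Brownian motion, scaled by the value of that ratio expected on the tree. A minimal sketch of the standard formula follows; the trait values and the covariance matrix here are illustrative, not the study's fracture rates or SNP-derived distances.

```python
import numpy as np

def blomberg_k(x, C):
    """Blomberg's K for trait vector x given phylogenetic covariance C."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    Cinv = np.linalg.inv(C)
    one = np.ones(n)
    # GLS estimate of the root (phylogenetic) mean
    a_hat = (one @ Cinv @ x) / (one @ Cinv @ one)
    d = x - a_hat
    mse0 = (d @ d) / (n - 1)          # variance ignoring the tree
    mse = (d @ Cinv @ d) / (n - 1)    # variance accounting for the tree
    observed = mse0 / mse
    expected = (np.trace(C) - n / (one @ Cinv @ one)) / (n - 1)
    return observed / expected

# Sanity check: on a star phylogeny (identity covariance) the observed and
# expected ratios coincide, so K = 1 regardless of the trait values.
rng = np.random.default_rng(0)
trait = rng.normal(size=10)
k_star = blomberg_k(trait, np.eye(10))
```

K near 1 means the trait varies about as much as Brownian evolution on the tree predicts; the small K values reported above (≈0.1) mean fracture rates vary far more across populations than their genetic relatedness would predict.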