Paste raw output from summary(), fixest, or glm(). Customize titles, labels, column groups, and formatting. Export to LaTeX, Word, or PDF.
No account · No credit card · No limits
> model <- lm(mpg ~ wt + hp, data = mtcars)
> summary(model)
...
Coefficients:
Estimate Std. Error t value Pr(>|t|)
(Intercept) 37.22727 1.59879 23.285 < 2e-16 ***
wt -3.87783 0.63273 -6.131 1.12e-06 ***
hp -0.03177 0.00903 -3.519 0.00145 **
---
Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
Multiple R-squared: 0.8268, Adjusted R-squared: 0.8148
> model2 <- lm(mpg ~ wt + hp + am + qsec, data = mtcars)
> summary(model2)
...
Coefficients:
Estimate Std. Error t value Pr(>|t|)
(Intercept) 17.34951 9.31770 1.862 0.07362 .
wt -3.23883 0.84596 -3.829 0.00069 ***
hp -0.01785 0.01193 -1.495 0.14666
am 2.92950 1.39761 2.095 0.04576 *
qsec 0.82104 0.44408 1.849 0.07573 .
---
Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
Multiple R-squared: 0.8624, Adjusted R-squared: 0.842
Just paste your console output. No need to install stargazer, modelsummary, or any other package — customize everything visually, then export to LaTeX, Word, or PDF.
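fixest output pastes just as well as base R output. A minimal sketch (the `cyl` fixed effect here is purely illustrative):

```r
library(fixest)

# Estimate an OLS model with cylinder fixed effects;
# the printed summary() can be pasted directly into tables.pub
m <- feols(mpg ~ wt + hp | cyl, data = mtcars)
summary(m)
```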
Table 1: Determinants of Fuel Efficiency — Multi-Model Analysis
| | OLS | OLS | WLS | Robust | Robust |
|---|---|---|---|---|---|
| | (1) | (2) | (3) | (4) | (5) |
| Weight (1000 lbs) | −5.34*** | −3.37** | −4.91*** | −5.18*** | −3.48** |
| | (0.559) | (0.946) | (0.612) | (0.583) | (0.991) |
| Horsepower | | −0.018 | −0.032** | | −0.021 |
| | | (0.015) | (0.011) | | (0.016) |
| Manual trans. | | 1.478 | 1.941 | | 1.612 |
| | | (1.441) | (1.402) | | (1.478) |
| Quarter mile (s) | | 0.558 | 0.612 | | 0.491 |
| | | (0.539) | (0.551) | | (0.562) |
| N | 32 | 32 | 32 | 32 | 32 |
| R² | 0.753 | 0.856 | 0.768 | 0.749 | 0.854 |
| Adj. R² | 0.745 | 0.828 | 0.761 | 0.740 | 0.824 |
Standard errors in parentheses. *** p<0.001, ** p<0.01, * p<0.05. Dependent variable: miles per gallon. WLS weighted by 1/wt; robust columns use HC3 standard errors. Data: mtcars (n=32).
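The WLS and robust columns come from standard R tooling; a sketch of how such models could be fit (the covariate set mirrors the table, and the sandwich/lmtest packages supply the HC3 standard errors):

```r
library(sandwich)  # vcovHC() for the HC3 covariance estimator
library(lmtest)    # coeftest() to print coefficients with custom SEs

# Plain OLS fit
ols <- lm(mpg ~ wt + hp + am + qsec, data = mtcars)

# WLS: weight observations by 1/wt, as in the table notes
wls <- lm(mpg ~ wt + hp + am + qsec, data = mtcars, weights = 1 / wt)
summary(wls)

# Robust: the same OLS point estimates, reported with HC3 standard errors
coeftest(ols, vcov = vcovHC(ols, type = "HC3"))
```

Any of these printed summaries can be pasted as-is.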
Every detail is customizable. Rename variables, change fonts, adjust significance stars, group columns — all without touching LaTeX.
Edit data directly
Rename variables, change coefficients, add or remove rows — click the table to edit anything.
Style & formatting
Choose between classic, three-line, grid, or minimal table styles. Adjust borders and spacing.
Font & size
Pick from Computer Modern, Palatino, Times, Charter, and more. Scale from tiny to large.
Significance stars
Supports both R convention (p<0.001/0.01/0.05) and economics convention (p<0.01/0.05/0.1). Fully customizable.
Table 2: OLS Estimates with Progressive Controls
| | Dependent variable: log(wage) | | | |
|---|---|---|---|---|
| | (1) Baseline | (2) +Demo. | (3) +Ind. FE | (4) +Region FE |
| Education | 0.0891*** | 0.0854*** | 0.0823*** | 0.0792*** |
| | (0.0063) | (0.0058) | (0.0055) | (0.0054) |
| Experience | 0.0041** | 0.0039** | 0.0035** | 0.0033** |
| | (0.0017) | (0.0016) | (0.0015) | (0.0015) |
| Female | | −0.2964*** | −0.2812*** | −0.2743*** |
| | | (0.0358) | (0.0341) | (0.0338) |
| Married | | 0.0534* | 0.0412 | 0.0389 |
| | | (0.0301) | (0.0289) | (0.0287) |
| Industry FE | No | No | Yes | Yes |
| Region FE | No | No | No | Yes |
| N | 1,526 | 1,526 | 1,526 | 1,526 |
| R² | 0.159 | 0.213 | 0.241 | 0.258 |
Notes: *** p<0.01, ** p<0.05, * p<0.1. Robust standard errors in parentheses.
Column groups
Group columns under shared headers like "OLS" or "Logit". Perfect for multi-panel tables.
Titles & footnotes
Add a table title with auto-numbering, plus footnotes for methodology or data source notes.
AI refinement
Describe any change in plain English — "add a title", "remove the intercept", "use 3 decimal places".
Export anywhere
Download as PNG, PDF (cropped or A4), Word, or copy the LaTeX code directly into your paper.
Summary statistics, correlation matrices, and more — all supported out of the box.
Table 3: Summary Statistics — Iris Dataset
| | Mean | SD | Min | Max | N |
|---|---|---|---|---|---|
| Sepal length | 5.843 | 0.828 | 4.300 | 7.900 | 150 |
| Sepal width | 3.057 | 0.436 | 2.000 | 4.400 | 150 |
| Petal length | 3.758 | 1.765 | 1.000 | 6.900 | 150 |
| Petal width | 1.199 | 0.762 | 0.100 | 2.500 | 150 |
Data: Fisher's iris dataset. Three species: setosa, versicolor, virginica.
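A summary table like the one above can be assembled from base R alone; a minimal sketch (column names chosen to match the table):

```r
# Numeric columns of Fisher's iris dataset
num <- iris[, 1:4]

# Build a Mean / SD / Min / Max / N table, one row per variable
data.frame(
  Mean = round(colMeans(num), 3),
  SD   = round(apply(num, 2, sd), 3),
  Min  = apply(num, 2, min),
  Max  = apply(num, 2, max),
  N    = nrow(num)
)
```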
Table 4: Correlation Matrix
| | mpg | wt | hp | qsec |
|---|---|---|---|---|
| mpg | 1.000 | | | |
| wt | −0.868*** | 1.000 | | |
| hp | −0.776*** | 0.659*** | 1.000 | |
| qsec | 0.419* | −0.175 | −0.708*** | 1.000 |
Data: mtcars (n=32). *** p<0.001, ** p<0.01, * p<0.05
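One way to produce these correlations in R; the Hmisc package is one option for the p-values behind the significance stars:

```r
# Correlations among the four mtcars variables in the table
vars <- as.matrix(mtcars[, c("mpg", "wt", "hp", "qsec")])
round(cor(vars), 3)

# rcorr() also reports the p-value matrix used for the stars
library(Hmisc)
rcorr(vars)
```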
Table 2: Logistic Regression — Transmission Type
| | (1) Logit | (2) Logit |
|---|---|---|
| Weight (1000 lbs) | −4.024** | −6.418** |
| | (1.654) | (3.184) |
| Horsepower | | −0.068 |
| | | (0.062) |
| Quarter mile time (s) | | 2.148* |
| | | (1.115) |
| N | 32 | 32 |
| AIC | 24.83 | 21.47 |
Standard errors in parentheses. Dependent variable: am (1=manual).
*** p<0.001, ** p<0.01, * p<0.05
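Logit tables like this one start from plain `glm()` output; a sketch, assuming column (1) is weight-only and column (2) adds horsepower and quarter-mile time:

```r
# Logistic regressions for transmission type (am: 1 = manual)
m1 <- glm(am ~ wt, family = binomial, data = mtcars)
m2 <- glm(am ~ wt + hp + qsec, family = binomial, data = mtcars)

# Paste either printed summary; AIC() compares the two fits
summary(m2)
AIC(m1, m2)
```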
Paste any R output — summary(), stargazer, fixest, or modelsummary. tables.pub auto-detects the format.
Journal of Labor Economics · Vol. 44, No. 2 · 2026
Sarah M. Chen and David A. Rodriguez
Department of Economics, University of Michigan
Abstract
This paper re-examines the returns to education and labor market experience using data from the Current Population Survey (2018–2024). We estimate Mincerian wage equations with progressively richer specifications. Our preferred specification yields a return to education of 8.2 percent per year of schooling. IV estimates using distance to college as an instrument suggest modest upward bias in OLS, consistent with positive selection.
3. Results
Table 1 reports robustness checks using alternative estimators. The IV specification instruments education with distance to the nearest college. The first-stage F-statistic of 42.1 exceeds conventional thresholds for weak instruments. The Heckman selection model addresses potential sample selection, yielding a positive and significant Mills ratio.
Table 1: Determinants of Fuel Efficiency
| OLS | |||
|---|---|---|---|
| (1) Baseline | (2) Core | (3) Full | |
| Weight (1000 lbs) | −5.344*** | −2.879** | −3.367** |
| (0.559) | (0.905) | (0.946) | |
| Horsepower | −0.037*** | −0.018 | |
| (0.010) | (0.015) | ||
| Manual transmission | 2.084 | 1.478 | |
| (1.376) | (1.441) | ||
| Quarter mile time (s) | 0.558 | ||
| (0.539) | |||
| Cylinders | −0.419 | ||
| (0.607) | |||
| N | 32 | 32 | 32 |
| R2 | 0.753 | 0.840 | 0.856 |
| Adj. R2 | 0.745 | 0.823 | 0.828 |
Standard errors in parentheses. Dependent variable: miles per gallon (mpg).
*** p<0.001, ** p<0.01, * p<0.05
The quantile regression at the median yields a somewhat smaller education coefficient of 0.079, suggesting that returns are higher in the upper tail of the conditional wage distribution. Across all three estimators, the gender gap remains large and precisely estimated.
3.2 Progressive Specifications
Table 2 presents our OLS estimates with progressive covariate adjustment. Column (1) reports the baseline Mincerian specification. Columns (2)–(4) add demographic controls, industry fixed effects, and region fixed effects. The return to schooling declines modestly from 8.9 to 7.9 percent.
Table 2: Logistic Regression — Transmission Type
| | (1) Logit | (2) Logit |
|---|---|---|
| Weight (1000 lbs) | −4.024** | −6.418** |
| | (1.654) | (3.184) |
| Horsepower | | −0.068 |
| | | (0.062) |
| Quarter mile time (s) | | 2.148* |
| | | (1.115) |
| N | 32 | 32 |
| AIC | 24.83 | 21.47 |
Standard errors in parentheses. Dependent variable: am (1=manual).
*** p<0.001, ** p<0.01, * p<0.05
The gender wage gap is substantial and persistent. The coefficient on Female implies that women earn approximately 27–30 log points less than men across all specifications. The marriage premium is positive but loses significance once we account for industry composition.
3.3 Heterogeneous Returns
Table 3 examines heterogeneity in the returns to education by gender and age. The returns to education are higher for women (0.089) than for men (0.076), and higher for workers under 40 (0.091) than for older workers (0.071). The urban premium is robust across all subgroups.
Table 3: Summary Statistics — Iris Dataset
| | Mean | SD | Min | Max | N |
|---|---|---|---|---|---|
| Sepal length | 5.843 | 0.828 | 4.300 | 7.900 | 150 |
| Sepal width | 3.057 | 0.436 | 2.000 | 4.400 | 150 |
| Petal length | 3.758 | 1.765 | 1.000 | 6.900 | 150 |
| Petal width | 1.199 | 0.762 | 0.100 | 2.500 | 150 |
Data: Fisher's iris dataset. Three species: setosa, versicolor, virginica.
4. Discussion
Our estimates align closely with the meta-analytic findings of Card (1999) and the more recent work of Autor (2014). The finding that returns are higher for women than men is consistent with Dougherty (2005), who argues that female college graduates benefit disproportionately from access to professional occupations.
References
Autor, D. H. (2014). Skills, education, and the rise of earnings inequality. Science, 344(6186), 843–851.
Card, D. (1999). The causal effect of education on earnings. Handbook of Labor Economics, 3, 1801–1863.
Dougherty, C. (2005). Why are the returns to schooling higher for women than for men? Journal of Human Resources, 40(4), 969–988.
Mincer, J. (1974). Schooling, Experience, and Earnings. Columbia University Press.
All tables generated with tables.pub — scroll to browse
Skip the stargazer/modelsummary setup. Paste your R output and get a publication-ready table in seconds — no packages to install.