Global modeling with automated ML: impact of the One Big Beautiful Bill on Big Tech | R-Bloggers


[This article was first published on DataGeeek, and kindly contributed to R-bloggers.]



Morgan Stanley analysts believe the One Big Beautiful Bill will be a boon for Big Tech, arguing that it provides a cash infusion to the AI giants and strengthens their dominance in the AI race ahead.

But Trump's August 1 tariffs, intended to offset the tax cuts in that bill, do not appear to have benefited the tech companies, as the chart below shows. Google and Meta look resilient compared with Amazon, likely thanks to their advertising businesses.

Source code:

library(tidymodels)
library(tidyverse)
library(tidyquant)
library(timetk)
library(modeltime.h2o)

#Amazon
df_amazon <- 
  tq_get("AMZN") %>% 
  select(date, Amazon = close)

#META
df_meta <- 
  tq_get("META") %>% 
  select(date, META = close)

#Google
df_google <- 
  tq_get("GOOGL") %>% 
  select(date, Google = close)

#Merging the datasets
df_merged <- 
  df_amazon %>% 
  left_join(df_meta, by = "date") %>% 
  left_join(df_google, by = "date") %>% 
  drop_na() %>% 
  filter(date >= last(date) - months(12)) %>% 
  pivot_longer(-date,
               names_to = "id",
               values_to = "value") %>% 
  mutate(id = as_factor(id)) 
  
  
#Train/Test Splitting
splits <- 
  df_merged %>% 
  time_series_split(
    assess     = "15 days", 
    cumulative = TRUE
  )
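With cumulative = TRUE, time_series_split() keeps all history before the assessment window for training and reserves the last 15 days for testing. A minimal base-R sketch of that cumulative split logic, using a made-up daily sequence in place of the stock data:

```r
# Toy daily series standing in for the merged stock data
dates <- seq(as.Date("2025-01-01"), by = "day", length.out = 100)

cutoff    <- max(dates) - 15   # start of the 15-day assessment window
train_idx <- dates <= cutoff   # cumulative: every earlier observation
test_idx  <- dates >  cutoff   # the final 15 days, held out for testing

sum(train_idx)  # 85 training days
sum(test_idx)   # 15 test days
```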


#Recipe
recipe_spec <- 
  recipe(value ~ ., data = training(splits)) %>%
  step_timeseries_signature(date) 

train_tbl <- training(splits) %>% bake(prep(recipe_spec), .)
test_tbl  <- testing(splits) %>% bake(prep(recipe_spec), .)
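step_timeseries_signature() expands the date column into a set of calendar features (year, month, day of week, and so on) that the H2O models can learn from. A rough base-R illustration of the kind of features it derives; the column names here are simplified, not the exact names the recipe produces:

```r
dates <- as.Date(c("2025-07-21", "2025-08-01"))

# Hand-rolled calendar features, mimicking a timeseries signature
signature <- data.frame(
  date  = dates,
  year  = as.integer(format(dates, "%Y")),
  month = as.integer(format(dates, "%m")),
  mday  = as.integer(format(dates, "%d")),
  wday  = as.integer(format(dates, "%u"))  # 1 = Monday ... 7 = Sunday
)
signature
```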


#Initialize H2O
h2o.init(
  nthreads = -1,
  ip       = 'localhost',
  port     = 54321
)



#Model specification and fitting
model_spec <- automl_reg(mode = 'regression') %>%
  set_engine(
    engine                     = 'h2o',
    max_runtime_secs           = 5, 
    max_runtime_secs_per_model = 3,
    max_models                 = 3,
    nfolds                     = 5,
    exclude_algos              = c("DeepLearning"),
    verbosity                  = NULL,
    seed                       = 98765
  ) 


model_fitted <- 
  model_spec %>%
  fit(value ~ ., data = train_tbl)

#Modeltime Table
model_tbl <- 
  modeltime_table(
  model_fitted
  )




#Calibrate by ID
calib_tbl <- 
  model_tbl %>%
  modeltime_calibrate(
    new_data = test_tbl, 
    id       = "id"
  )

#Measure Test Accuracy

#Global Accuracy
calib_tbl %>% 
  modeltime_accuracy(acc_by_id = FALSE) %>% 
  table_modeltime_accuracy(.interactive = FALSE)

#Local Accuracy
calib_tbl %>% 
  modeltime_accuracy(acc_by_id = TRUE) %>% 
  table_modeltime_accuracy(.interactive = TRUE)
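modeltime_accuracy() reports metrics such as MAE and RMSE, either pooled across all series (global) or per stock (local). The underlying calculations are simple; a hand-rolled base-R version with made-up actual and predicted values:

```r
# Illustrative values only, not output from the fitted model
actual <- c(100, 102, 101, 105)
pred   <- c( 99, 103, 100, 107)

mae  <- mean(abs(actual - pred))        # mean absolute error
rmse <- sqrt(mean((actual - pred)^2))   # root mean squared error

mae   # 1.25
rmse  # ~1.32
```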



#Prediction Intervals
calib_tbl %>%
  modeltime_forecast(
    new_data    = test_tbl,
    actual_data = df_merged %>% filter(date >= as.Date("2025-07-18")),
    conf_by_id  = TRUE
  ) %>%
  group_by(id) %>%
  plot_modeltime_forecast(
    .facet_ncol  = 2,
    .interactive = FALSE,
    .line_size = 1
  )  +
  labs(title = "Global Modeling with Automated ML", 
       subtitle = "Predictive Intervals of GBM Model", 
       y = "", x = "") + 
  scale_y_continuous(labels = scales::label_currency()) +
  scale_x_date(labels = scales::label_date("%b %d"),
               date_breaks = "4 days") +
  theme_tq(base_family = "Roboto Slab", base_size = 16) +
  theme(plot.subtitle = ggtext::element_markdown(face = "bold"),
        plot.title = element_text(face = "bold"),
        strip.text = element_text(face = "bold"),
        #axis.text.x = element_text(angle = 60, hjust = 1, vjust = 1),
        legend.position = "none")

