Seasonality after 1st differencing
I am working with a financial time series (monthly frequency), and the raw data is not stationary according to the ADF and KPSS tests. I then apply deflation (to account for inflation), a log transformation (to make an exponential trend linear), and finally take first differences. The resulting series is still not stationary.
Running the ACF/PACF on the first differences gives the following plot:
[ACF/PACF plot of the first-differenced series]
This seems to suggest seasonality at lags 11 and 22 (the pattern was not visible before first differencing). Does this mean I should apply another difference, now at lag 11 and possibly 22, to remove the seasonality?
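For concreteness, here is a minimal sketch of the preprocessing and diagnostics described above, using Python's statsmodels; the file and column names (series.csv, value, cpi) are placeholders rather than the actual data:

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from statsmodels.tsa.stattools import adfuller, kpss
from statsmodels.graphics.tsaplots import plot_acf, plot_pacf

# hypothetical file/column names: a monthly value series and a matching price index
df = pd.read_csv("series.csv", parse_dates=["date"], index_col="date")
y, cpi = df["value"], df["cpi"]

real_y = y / cpi * cpi.iloc[0]      # deflate to base-period prices
log_y = np.log(real_y)              # make exponential growth linear
d1 = log_y.diff().dropna()          # first differences

# stationarity checks (ADF: H0 = unit root; KPSS: H0 = stationary)
print("ADF  p-value:", adfuller(d1)[1])
print("KPSS p-value:", kpss(d1, regression="c", nlags="auto")[1])

# ACF/PACF with 95% bands, out to three seasonal cycles
fig, axes = plt.subplots(2, 1, figsize=(8, 6))
plot_acf(d1, lags=36, ax=axes[0])
plot_pacf(d1, lags=36, ax=axes[1])
plt.show()
```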
EDIT: Thanks for the answers. The link to text data is here.
time-series stationarity seasonality acf-pacf
asked by abu
2 Answers
The answer is no: there is no problem of seasonality or autocorrelation here.
ACF and PACF plots typically use 95% confidence bands. This means that roughly 5% of the values fall outside the bands even when the process shows no autocorrelation or partial autocorrelation at all; such spikes simply happen by chance.
Also, seasonal series tend to have a distinctive ACF: it takes the form of waves, as you can observe in this question.
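As a quick illustration of the 95%-band point, a minimal simulation sketch: for pure white noise of the same length as the question's series, roughly 5% of the sample ACF values fall outside the bands, and a Ljung-Box test gives a formal check of whether a series differs significantly from white noise (illustrative only, not part of the data in the question):

```python
import numpy as np
from statsmodels.tsa.stattools import acf
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(0)
n, n_lags = 132, 24                       # same length as the series in the question
band = 1.96 / np.sqrt(n)                  # approximate 95% band drawn in ACF plots

shares = []
for _ in range(1000):
    wn = rng.normal(size=n)               # pure white noise: no true autocorrelation
    r = acf(wn, nlags=n_lags, fft=True)[1:]
    shares.append(np.mean(np.abs(r) > band))

print(f"average share of lags outside the bands: {np.mean(shares):.3f}")  # about 0.05

# formal whiteness check: large p-values = no evidence against white noise
wn = rng.normal(size=n)
print(acorr_ljungbox(wn, lags=[11, 22], return_df=True))
```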
answered by cure (new contributor)
– whuber♦: These plots do exhibit precisely those wave patterns you mention, apparently contradicting your conclusion.
– cure: I would not agree that the differenced series differs significantly from white noise, which can also be tested. ACF patterns for seasonal data should be much stronger. Besides, it is financial data; it rarely shows any patterns, due to arbitrage.
– whuber♦: How much stronger would "much" stronger be? You seem to be arguing in a circular fashion: because you don't expect the series to exhibit seasonality, you cannot agree that the PACF and ACF show evidence of seasonality!
– cure: Significantly different from white noise.
The answer is no, because you may have injected this phenomenon yourself by transforming the data in an unwarranted fashion. See the Slutsky effect, where a linear (weighted) combination of i.i.d. values leads to a series with autocorrelative structure (Slutsky: http://www-history.mcs.st-andrews.ac.uk/Biographies/Slutsky.html; http://mathworld.wolfram.com/Slutzky-YuleEffect.html). Unnecessary differencing can inject variability: consider the variance of a random process that is differenced or unnecessarily filtered.
Non-stationarity is a symptom with possibly many causes. One cause is a shift in the mean at one or more points in time. Another is a change in parameters at one or more points in time. Another is a deterministic change in the error variance at one or more points in time. Prof. Spyros Makridakis wrote an article (http://www.insead.edu/facultyresearch/research/doc.cfm?did=46900) on the danger of using differencing to render a series stationary.
"When (and why) should you take the log of a distribution (of numbers)?" discusses when you should take a power transform, i.e. to decouple the relationship between the expected value and the variance of the model's residuals.
You may be injecting structure via unwarranted transformations (differencing is a transformation).
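A minimal sketch of that injection effect (my illustration, under the simplest possible setup): first-differencing i.i.d. noise produces an MA(1) series whose theoretical lag-1 autocorrelation is -0.5 and whose variance is doubled.

```python
import numpy as np
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(1)
e = rng.normal(size=100_000)    # i.i.d. noise: no autocorrelation at all
d = np.diff(e)                  # an "unnecessary" first difference

print("variance ratio d/e  :", d.var() / e.var())               # about 2.0
print("lag-1 autocorrelation:", acf(d, nlags=1, fft=True)[1])    # about -0.5
```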
Simply adjusting for a contemporaneous series (inflation) may also be incorrect, since the Y variable may be affected by changes in the X variable or by lags of the X variable. This is why we build SARMAX models: https://autobox.com/pdfs/SARMAX.pdf
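For instance, rather than pre-deflating, the price index (and a lag of it) can enter as exogenous regressors. A rough statsmodels analogue (SARIMAX with exogenous regressors), where the file name, column names, and the (1,0,0) order are assumptions for illustration only:

```python
import pandas as pd
import statsmodels.api as sm

# hypothetical file/column names, as in the earlier sketch
df = pd.read_csv("series.csv", parse_dates=["date"], index_col="date")
X = pd.concat([df["cpi"], df["cpi"].shift(1)], axis=1, keys=["cpi", "cpi_lag1"]).dropna()
y = df["value"].loc[X.index]

# ARMAX-type model: the price index and its lag enter as regressors
# instead of being divided out of y beforehand
model = sm.tsa.SARIMAX(y, exog=X, order=(1, 0, 0), trend="c")
res = model.fit(disp=False)
print(res.summary())
```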
Why don't you post your original data in CSV format? I and others may be able to help.
EDITED AFTER RECEIPT OF DATA:
I took your 132 monthly values into AUTOBOX (a piece of software that I have helped to develop) and automatically developed a useful model. It has a number of advanced features that can be helpful.
Here is the data, which clearly suggests that as the series gets higher, the variability increases. An even "truer" statement is that the variance changes at one point in time (around period 54) rather than pervasively, suggesting that weighted least squares would be more appropriate than a log transform. This is found via the Tsay test described here: https://onlinelibrary.wiley.com/doi/abs/10.1002/for.3980070102 (with an excerpt here).
The Tsay test shown here led to a first-difference model (nearly second differences, as suggested by the AR coefficients nearly summing to 1.0), here, with 9 pulses/shocks and a positive level shift (intercept change) at period 68.
The model in more detail is here and here.
The Actual, Fit and Forecast graph is here, with Monte Carlo simulations leading to these forecasts and limits.
The role of statistics is to separate the data into signal and noise, so the litmus test is "did the equation generate a suitable noise process?" I would say a loud "yes".
Here is the plot of the model's residuals, with this ACF.
In summary, a useful model requires that the data be treated for non-constant variance by employing weighted least squares, effectively discounting values 54-132. The ARIMA model is (2,1,0)(0,0,0)[12] with a constant and one level shift, along with 9 pulses.
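A hedged sketch of how a model of this general form could be set up in statsmodels, building a small augmented regressor matrix with a level-shift dummy and pulse dummies; the pulse periods below are placeholders (only the period-68 level shift is taken from the answer), and AUTOBOX's weighted-least-squares handling of the variance change is not reproduced:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.diagnostic import acorr_ljungbox

df = pd.read_csv("series.csv", parse_dates=["date"], index_col="date")  # hypothetical file
y = df["value"]
t = np.arange(1, len(y) + 1)

# deterministic regressors: a level shift at period 68 and a few pulse dummies
exog = pd.DataFrame(index=y.index)
exog["level_68"] = (t >= 68).astype(float)           # intercept change from period 68 on
for p in [12, 30, 91]:                               # placeholder pulse periods
    exog[f"pulse_{p}"] = (t == p).astype(float)      # one-period shocks

model = sm.tsa.SARIMAX(y, exog=exog, order=(2, 1, 0), trend="c")
res = model.fit(disp=False)
print(res.summary())

# litmus test: did the model leave behind something close to white noise?
print(acorr_ljungbox(res.resid[1:], lags=[12, 24], return_df=True))
```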
It can help to see a segment of the augmented data matrix with the pulses and level shift, where the columns represent the latent deterministic structure that was "scraped" from the data.
Hope this helps you and the list better (at least partially) understand the extraction of signal from data. No seasonality is detected in the data given.
answered by IrishStat
– abu: Thanks for a really detailed answer. I added a link to the data.