Any complete standardized SNIa data out there?



 
 
  #1  
Old December 15th 04, 09:30 PM
sean

(steve)
You could have saved Bjoern and me a lot of work if you had
simply asked what the k-correction is instead of guessing.


Actually, if you look back through the last month's posts you can
see that I did repeatedly ask for clarification on whether or not
the minuet formula is part of the k-correction. And I never got a
clear answer on that, so I had to assume it was, until finally now
it is clarified to me that the two are separate. It seems so
simple now, but believe me, I did ask and no answer was given.
But thanks anyway for the info.
Here is another example of where I can only `guess` at a process
yet understand the overall concept well enough. It's the
details I lack, not having done the calculation myself. Take
the fitting process: from what I understand,
when a template is fitted to the SN1997ek data there are
at least two steps. Here's the formula from Goldhaber.
I(t) Imax = fR ((t-tmax)/s(1+z))=b
I have no idea what fR is or what a `function` is, but I have
to assume that if one were to describe in words what happens to
the template in this fitting process, it is essentially
normalized to the data peak and divided by 1+z. Looking at
fig 1, the template peak is much lower than the data peak, so I
can only assume that to fit the decay of the template from 1.54
onwards one doesn't use the averaged day peak
value of 4.6 but rather the low end of its error margin.
That way the HST data and the other decay data fit well and the
peak data not so well, but still within error margins. That, I
assume, is called a `best fit`. So is that roughly correct, if one
had to describe the formula in words? So in words, what is
happening is that the chi-square process calculates all the
different possible template lightcurve shapes where the
template lies within all the available error bars, and
`best fitting` is a complex calculation that produces the
optimum lightcurve shape, one that fits as much of the data as
closely as possible to the middle point of the error margin, with
the fewest datapoints being included only at the outside of
their error margins? In other words, a bad fit is where the
template has its peak at 4.6 but, because the template won't be
within the error margins of the table data at all from about
day 20 onwards, this is excluded. Is this roughly correct as a
verbal description of the minuet formula?
Because if it is, then I believe that one can get an even better
fit to more of the data than Knop's fit if one doesn't include
the 1+z transformation in the formula. And the reason why Knop
doesn't use this result is that he presupposes expansion, which rules
out, in his mind, using any fitting that doesn't get transformed
by 1+z. If this is the case I would criticize him for
presupposing expansion in his fitting for the very purpose of
PROVING expansion. This would be very unscientific, as really he
should have done best fits including those that did AND didn't
dilate the template timescale by 1+z, rather than fits that only
included multiplying by 1+z.
So that's a fair argument, don't you think? Of course it depends
on whether I have understood the formula well enough. And if I haven't,
can I ask you: where have I got my verbal description of the
minuet formula above wrong? After all, you do say `just ask`.

I have no idea where you get this. Are you looking at 1997ek
in Figure 1 from the Knop et al. paper? If I draw a horizontal
line between the 0.4 tick marks on each side of the graph, it
goes directly through the point for the HST reading. The time
is clearly before 30 days, but the exact value is hard to read
from the graph. I make it about 28 days or so, and this is
consistent with the times in Table 11.


Yes, definitely the graph is hard to read, and maybe it's not
a good idea to base any final judgement on reading the graph.
The problem is it's the only way to confirm where the HST reading
sits on the template.
I also now refer to fig 1, I band, 1997ek, top right-hand graph, page
11. I even blow up the graph to 800% and count the
pixel heights between 0.4 and 0.5 (8 pixels) and the pixel height of
the HST reading (5 pixels). The point of the HST reading is 5
pixels high, and its top pixel runs one pixel above 0.4 while its
bottom pixel runs 3 pixels below 0.4. That puts, as I've said
before, the HST reading at about 0.38, seeing as the middle pixel
of the HST reading is parallel to the 0.38 pixel on the graph.
A small change, I agree, but if you then consider that 1 pixel on
the graph works out to 2 days on the time axis, and if the middle
pixel of the HST reading hits the template 1 pixel later than
where the 0.4 pixel hits the template, that technically puts
the HST reading 1 pixel, or 2 days, later than 0.4 on the graph.
And in our argument way back on this thread, here's your
calculation from the table.....
(Steve's quote..)
"For 1997ek, we have from HST data in Table 11 a flux density of
1.54 at day 50846.7 and 0.75 at day 50858.8. This is a decay of
0.78 mag in 12.1 days or 16 days per magnitude. The next
measurement is 0.46 at day 50871.9, giving 0.53 mag in 13.1 days
or 25 days per magnitude. The error bars are about 15% on the
first decay time and 20% on the second one..."

And my argument was that if it takes 16 days to decay 1 mag from
day 50846.7 and 25 days to decay 1 mag from day 50871, then that
means on average, every day later than 50846.7, the decay rate
increases by 1 day. In that case one can extrapolate that every
day before day 50846, if we follow the trend through, it will take
1 day less to decay 1 mag. And as the pixels show on the graph, the
HST reading you refer to is 2 days (1 pixel) later than the
official 0.4 peak+1 reading, which means that extrapolating back
from the HST reading one gets 14 days (16-2) for a 1 mag decay
from 0.4.
And that's what I have always said I get from the graphs.
But is this too close to call? Maybe yes. That's why I went
through all 11 to see what I get for a 1 mag decay from 0.4 on the
graphs. And on average there is little or no time dilation. You can
check through yourself and see, but I'm sure we'll reach rough
agreement. Even if you, let's say, get a couple of days more than
me for each SN, it will still favor, on average, no time
dilation, because in fact 1997ek is the most favorable to a time
dilation argument at 1.45, which is about midway between 1 and 1.86.
The rest are much closer to 1 than 1997ek.

Sean
  #2  
Old December 16th 04, 10:12 AM
Bjoern Feuerbacher

sean wrote:
(steve)

You could have saved Bjoern and me a lot of work if you had
simply asked what the k-correction is instead of guessing.



Actually, if you look back through the last month's posts you can
see that I did repeatedly ask for clarification on whether or not
the minuet formula is part of the k-correction.


There is no "minuet formula".

There is a fitting procedure (algorithm) with the title MINUIT.


And I never got a
clear answer on that, so I had to assume it was, until finally now
it is clarified to me that the two are separate. It seems so
simple now, but believe me, I did ask and no answer was given.
But thanks anyway for the info.
Here is another example of where I can only `guess` at a process
yet understand the overall concept well enough. It's the
details I lack, not having done the calculation myself. Take
the fitting process: from what I understand,
when a template is fitted to the SN1997ek data there are
at least two steps. Here's the formula from Goldhaber.
I(t) Imax = fR ((t-tmax)/s(1+z))=b


You probably mean I(t)/Imax = fR((t-tmax)/(s(1+z))) + b, which
one can obviously also write as
I(t) = Imax * ( fR((t-tmax)/w) + b ),
with w = s(1+z). So one has four free parameters here: Imax, tmax,
w and b.


I have no idea what fR is or what a `function `is


A function is simply something which assigns one value to another.
In this case, the function fR(t) gives for every time t the magnitude
of a SN, normalized so that the peak is at 1. I.e. the function fR
describes the known lightcurves of SNs Ia.
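
In code, the model might look roughly like this (a sketch in
Python; the f_R below is just an invented placeholder shape, since
the real fR is the tabulated SN Ia template, not an analytic
formula):

```python
import numpy as np

def f_R(t):
    # Stand-in for the normalized SN Ia template lightcurve:
    # equals 1 at t = 0 (the peak). Rise/fall timescales invented.
    rise, fall = 5.0, 20.0   # days
    return np.where(t < 0, np.exp(t / rise), np.exp(-t / fall))

def model(t, Imax, tmax, w, b):
    # Goldhaber-style lightcurve model with w = s*(1+z):
    #   I(t) = Imax * ( fR((t - tmax)/w) + b )
    return Imax * (f_R((t - tmax) / w) + b)
```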


but I have
to assume that if one were to describe in words what happens to
the template in this fitting process, it is essentially
normalized to the data peak and divided by 1+z.


No, that is not at all what is actually done.

What is done is (roughly) that the differences between the measured
fluxes (let's call them I_i) and the theoretical values I(t_i)
at the times of measurement t_i are computed, squared and
summed up:
sum_i (I_i - I(t_i))^2
For this sum, one then searches for the minimum by varying the
four parameters Imax, tmax, w and b, until the sum is minimal
(i.e. until the deviations between the measured fluxes and the
theoretical curve become minimal). This is done using derivatives and
the like - things which you unfortunately probably never heard
of, if you don't even know what a function is. :-(

If you think there is something wrong with such an approach, please
consider that all of this is totally standard stuff, done in *every*
area of physics.

Oh, and please consider that what I described above is only a
rough description - in a more detailed analysis, one also has
to take the error margins into account.
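
Continuing the sketch above, the whole procedure is a few lines
with a standard least-squares routine (the data arrays here are
invented numbers; the error margins enter as weights):

```python
from scipy.optimize import curve_fit

# Invented example data: times (days), measured fluxes, error bars.
t_data = np.array([-10.0, -5.0, 0.0, 10.0, 25.0, 40.0])
I_data = np.array([0.40, 0.75, 1.02, 0.70, 0.35, 0.15])
I_err  = np.array([0.05, 0.05, 0.04, 0.05, 0.06, 0.06])

# curve_fit varies (Imax, tmax, w, b) to minimize
#   sum_i ((I_i - model(t_i)) / sigma_i)^2
p0 = [1.0, 0.0, 2.0, 0.0]   # starting guesses
popt, pcov = curve_fit(model, t_data, I_data, p0=p0, sigma=I_err)
print("fitted w = s*(1+z):", popt[2])
```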


Looking at
fig 1, the template peak is much lower than the data peak, so I
can only assume that to fit the decay of the template from 1.54
onwards one doesn't use the averaged day peak
value of 4.6 but rather the low end of its error margin.
That way the HST data and the other decay data fit well and the
peak data not so well, but still within error margins. That, I
assume, is called a `best fit`. So is that roughly correct, if one
had to describe the formula in words?


No, sorry. One does not use the "low end of the error margin".
One uses all the data and all error margins, and then tries to
minimize the sum of the squared differences between the measured
data and the theoretical curve, by varying the four free parameters.

Also, the formula *is* not the fit. The formula gives the
function *with* (or *to*) which the fit is performed.


So in words, what is
happening is that the chi-square process calculates all the
different possible template lightcurve shapes where the
template lies within all the available error bars, and
`best fitting` is a complex calculation that produces the
optimum lightcurve shape, one that fits as much of the data as
closely as possible to the middle point of the error margin, with
the fewest datapoints being included only at the outside of
their error margins? In other words, a bad fit is where the
template has its peak at 4.6 but, because the template won't be
within the error margins of the table data at all from about
day 20 onwards, this is excluded. Is this roughly correct as a
verbal description of the minuet formula?


Yes, this is roughly right - with the exception that there still
is no "minuet formula". ;-)



Because if it is, then I believe that one can get an even better
fit to more of the data than Knop's fit if one doesn't include
the 1+z transformation in the formula. And the reason why Knop
doesn't use this result is that he presupposes expansion, which rules
out, in his mind, using any fitting that doesn't get transformed
by 1+z.


No, that makes no sense. s is a completely free parameter. If
there were no time dilation in the data, the fit would have
*shown* that! It would have yielded the result that s is
proportional to 1/(1+z). But what the fit actually showed is that s is
(roughly) constant.

Steve and you have been through this before; unfortunately, you
still have not understood this... :-(


[snip more of that]


So that's a fair argument, don't you think?


No.

[snip]


I have no idea where you get this. Are you looking at 1997ek
in Figure 1 from the Knop et al. paper? If I draw a horizontal
line between the 0.4 tick marks on each side of the graph, it
goes directly through the point for the HST reading. The time
is clearly before 30 days, but the exact value is hard to read
from the graph. I make it about 28 days or so, and this is
consistent with the times in Table 11.



Yes, definitely the graph is hard to read, and maybe it's not
a good idea to base any final judgement on reading the graph.
The problem is it's the only way to confirm where the HST reading
sits on the template.


Huh? Why? The table data gives you both the time and the magnitude
of the reading.


I also now refer to fig 1, I band, 1997ek, top right-hand graph, page
11. I even blow up the graph to 800% and count the
pixel heights between 0.4 and 0.5 (8 pixels) and the pixel height of
the HST reading (5 pixels). The point of the HST reading is 5
pixels high, and its top pixel runs one pixel above 0.4 while its
bottom pixel runs 3 pixels below 0.4. That puts, as I've said
before, the HST reading at about 0.38, seeing as the middle pixel
of the HST reading is parallel to the 0.38 pixel on the graph.


Well, as you yourself said, reading off the graphs is not very
accurate. Did you ever consider the possibility that some graphical
errors already crept in when the graphs were "printed"?


A small change, I agree, but if you then consider that 1 pixel on
the graph works out to 2 days on the time axis


Well, so my estimate of an error margin of 2 days was quite good,
apparently. ;-)


and if the middle
pixel of the HST reading hits the template 1 pixel later than
where the 0.4 pixel hits the template, that technically puts
the HST reading 1 pixel, or 2 days, later than 0.4 on the graph.
And in our argument way back on this thread, here's your
calculation from the table.....
(Steve's quote..)
"For 1997ek, we have from HST data in Table 11 a flux density of
1.54 at day 50846.7 and 0.75 at day 50858.8. This is a decay of
0.78 mag in 12.1 days or 16 days per magnitude. The next
measurement is 0.46 at day 50871.9, giving 0.53 mag in 13.1 days
or 25 days per magnitude. The error bars are about 15% on the
first decay time and 20% on the second one..."


Note that all three points lie quite well on the curve, so using
the table data instead of the curve directly is entirely
reasonable.


And my argument was that if it takes 16 days to decay 1 mag from
day 50846.7 and 25 days to decay 1 mag from day 50871, then that
means on average, every day later than 50846.7, the decay rate
increases by 1 day.


Sorry, I do not understand what exactly you mean here. A decay
rate is measured in one over time (inverse time), so saying that
it increases by 1 day makes no sense. Do you mean that the
decay *time* increases by 1 day?



[snip]


Bye,
Bjoern
  #3  
Old December 19th 04, 11:45 AM
sean

From: Bjoern Feuerbacher
Subject: Any complete standardized SNIa data out there?

Oh, and please consider that what I described above is only a
rough description - in a more detailed analysis, one also has
to take the error margins into account.


Thanks for the description. Much appreciated.

No, that makes no sense. s is a completely free parameter. If
there were no time dilation in the data, the fit would have
*shown* that! It would have yielded the result that s is
proportional to 1/(1+z). But what the fit actually showed
is that s is (roughly) constant.


I don't follow this bit about s. I don't think I said to remove s, or
that s isn't roughly constant? If I gave that impression I didn't
mean to. All I suggest is that the minuit formula could be redone
exactly as Knop does it, with a fit to table 11 data, except for
1 single change. Instead of the 1+z in the formula, replace that
with either 1+0 or a very low z like 1+0.01. Basically I would
like to try redoing Knop's fit for 1997ek but without a time
dilation, to see what happens.
Am I right in thinking you think that doing this is not allowed?
At the worst, all that would happen is that there would be no
fit. (I wouldn't be taking s out of the formula or changing s
in any way)

Sorry, I do not understand what exactly you mean here.
A decay rate is measured in one over time (inverse time), so
saying that it increases by 1 day makes no sense. Do you
mean that the decay *time* increases by 1 day?


Yes. So for each consecutive day the decay time increases by one
day, so that starting on day 50847 (1 day after the HST 1.54
reading) it will take 17 days to decay 1 mag. And starting
from day 50848 it will take 18 days, etc etc.
Thanks for the reading you made of 1997ek peak+1 to peak+2
from the 1997ek template. You got 14 days, I believe. Could I
possibly interest you in doing maybe 1 more? Like 1998be
or 1998as? Just to compare with what I got.


Sean
  #4  
Old December 20th 04, 12:03 PM
Bjoern Feuerbacher

sean wrote:
From: Bjoern Feuerbacher
Subject: Any complete standardized SNIa data out there?


Oh, and please consider that what I described above is only a
rough description - in a more detailed analysis, one also has
to take the error margins into account.



Thanks for the description. Much appreciated.


For the third time I ask: do you have access to Mathematica
or Maple? If yes, I could try to tell you how to do the
fits for yourself.


No, that makes no sense. s is a completely free parameter. If
there were no time dilation in the data, the fit would have
*shown* that! It would have yielded the result that s is
proportional to 1/(1+z). But what the fit actually showed
is that s is (roughly) constant.



I don't follow this bit about s. I don't think I said to remove s, or
that s isn't roughly constant? If I gave that impression I didn't
mean to. All I suggest is that the minuit formula could be redone
exactly as Knop does it, with a fit to table 11 data, except for
1 single change. Instead of the 1+z in the formula, replace that
with either 1+0 or a very low z like 1+0.01. Basically I would
like to try redoing Knop's fit for 1997ek but without a time
dilation, to see what happens.


Then you would have only s instead of s*(1+z) in the formula,
and the result is obvious: the s resulting from the fit would
be proportional to 1+z. It simply can't be another way, due
to the math of the fitting process!
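
If you want to see that without the theory, here is a rough
numerical sketch (continuing the Python sketches above; all
numbers invented): generate a lightcurve stretched by a known
factor, fit it once with s*(1+z) and once with a bare stretch,
and compare.

```python
# Synthetic check (invented numbers): the two parameterizations
# recover the same time-axis stretch.
z, s_true = 0.86, 1.0
w_true = s_true * (1 + z)

rng = np.random.default_rng(0)
t_obs = np.linspace(-15, 60, 30)
I_obs = model(t_obs, 1.0, 0.0, w_true, 0.0) + rng.normal(0, 0.02, 30)

# Fit 1: stretch written as s*(1+z), with s free.
fit1 = lambda t, Imax, tmax, s, b: model(t, Imax, tmax, s * (1 + z), b)
p1, _ = curve_fit(fit1, t_obs, I_obs, p0=[1, 0, 1, 0])

# Fit 2: stretch written as one bare parameter w ("z left out").
p2, _ = curve_fit(model, t_obs, I_obs, p0=[1, 0, 2, 0])

print(p1[2] * (1 + z), p2[2])   # both approximately w_true = 1.86
```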


Am I right in thinking you think that doing this is not allowed?


It is allowed - but it won't change the result that there is
indeed time dilation.


At the worst, all that would happen is that there would be no
fit. (I wouldn't be taking s out of the formula or changing s
in any way)


No, there would indeed be a fit, as long as there is any free
parameter for stretching the time axis in the formula.



Sorry, I do not understand what exactly you mean here.
A decay rate is measured in one over time (inverse time), so
saying that it increases by 1 day makes no sense. Do you
mean that the decay *time* increases by 1 day?



Yes. So for each consecutive day the decay time increases by one
day, so that starting on day 50847 (1 day after the HST 1.54
reading) it will take 17 days to decay 1 mag. And starting
from day 50848 it will take 18 days, etc etc.


So the "each consecutive day" refers to the day when you
start the counting, or what?

Well, that sounds strange. I am by no means an expert on
SN light curves, but I would have thought that the decay is
exponential (because, AFAIK, the light comes from radioactive decay),
i.e. it takes the same amount of time to decay by a certain factor no
matter where one starts the counting.
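
(For reference, the standard algebra behind that statement: an
exponential flux decay makes the magnitude increase linearly with
time, so the time per magnitude is a constant:)

```latex
L(t) = L_0\, e^{-t/\tau}
\;\Rightarrow\;
m(t) = m_0 + \frac{2.5}{\ln 10}\,\frac{t}{\tau},
\qquad
\Delta t_{1\,\mathrm{mag}} = \frac{\ln 10}{2.5}\,\tau \approx 0.921\,\tau .
```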


Thanks for the reading you made of 1997ek peak+1 to peak+2
from the 1997ek template. You got 14 days, I believe. Could I
possibly interest you in doing maybe 1 more? Like 1998be
or 1998as? Just to compare with what I got.


1998as: 0.4 at day 34, 0.16 at day 60 (both again with
an error margin of +- 2 days), so the difference would be
26 +- 3 days.


Bye,
Bjoern
  #5  
Old December 22nd 04, 06:50 PM
sean

So the "each consecutive day" refers to the day when you
start the counting, or what?

Well, that sounds strange. I am by no means an expert on
SN light curves, but I would have thought that the decay is
exponential (because, AFAIK, the light comes from radioactive decay),
i.e. it takes the same amount of time to decay by a certain factor no
matter where one starts the counting.

Maybe, but remember you also read 14 days off the graph, which matches
my reading of 14.5 and conflicts with Steve's 16 days.

From: Bjoern Feuerbacher

For the third time I ask: do you have access to Mathematica
or Maple? If yes, I could try to tell you how to do the
fits for yourself.


No I don't, sorry. I've never heard of them, actually.

Then you would have only s instead of s*(1+z) in the formula,
and the result is obvious: the s resulting from the fit would
be proportional to 1+z. It simply can't be another way, due
to the math of the fitting process!


This would only be the case if the universe were expanding.
BUT if I am correct and there is no time dilation, then you
would have s as roughly constant, and the result a fit
of the restframe template to the data. Anyway, you haven't
tried the fit, nor have I, so it seems only speculation
at this point.
You may be interested to hear that I plotted the B and V band
restframe templates used by Knop and supplied as table data
in his and Goldhaber's papers. I then scaled the time axis
by 1.86 % in fireworks and found they fit his templates
very well. Showing a) that he can show that a time dilated
template can fit the data, and b) that scaling by percentages
in graphic software like fireworks is acceptable and the
same as a mathematical calculation of multiplying each data
point in the tables by 1.86. I also multiplied by s and that
gives a better fit. I'm sure that will make you happy, but the
next bit will not. Using the undilated restframe B band
template (ie before it was time dilated by 1.86) I simply
shifted the template forward by about 10
days on the graph. And the result is that a fit of all
the table data within error margins is produced!!
In other words, although a time dilated B band template can
give a fit of the table data, as Knop has done, a non time
dilated B band template also gives as good a fit to
the data. The only difference is which day one puts the
peak of the template. Furthermore, I tried doing this with
most of the other 11 SN and found that the appropriate B or
V band non time dilated templates fit all the data in each
SN's set of table data. I am sure a re-fit of all the SN
data with non dilated templates will verify my findings
and prove that there is at least as strong an argument
for no time dilation as there is for time dilation.
And if you don't believe me, try a fit of 1997ek with z=1


1998as: 0.4 at day 34, 0.16 at day 60 (both again with
an error margin of +- 2 days), so the difference would be
26 +- 3 days.


Thanks for that reading. It's close to my reading of 23 days
below in the table, so that makes both your 1997ek and
1998as match my table readings reprinted below. (You had
14 days for 1997ek, I had 14.5.) How did you get error
margins off the template? (It doesn't have error margins.)
Although for 1998as, a careful analysis at 800% in Acrobat
shows that peak+1 is at the middle pixel between 30 and 40,
making it officially 35 for peak+1, and the 2nd last pixel of
seven between 50 and 60, making it roughly 58 for peak+2.
That's then 58-35=23, exactly as I had found earlier, and not
26 as you get.

Anyway, if nothing else, it shows that my readings off the
templates are roughly correct. And on average the ratios
below are much closer to no time dilation than time dilation.
Although if you add in error margins, I imagine you can argue
that the results can weakly support time dilation. That's OK as
long as you realize that within error margins the results also
support no time dilation. In fact, within error margins,
all the data supports much more strongly a no time dilation
argument.


1997ek (restframe 438nm) z=.86 14.5/10 = 1.45 (should be 1.86)
1998eq (restframe 469nm) z=.54 15/14.5 = 1.03 (should be 1.54)
1997ez (restframe 457nm) z=.78 16/13 = 1.2 (should be 1.78)
1998as (restframe 602nm) z=.35 23/22 = 1.04 (should be 1.35)
1998aw (restframe 565nm) z=.44 27/20.5 = 1.3 (should be 1.44)
1998ax (restframe 542nm) z=.50 30/22 = 1.36 (should be 1.50)
1998ay (restframe 496nm) z=.64 20/17.5 = 1.2 (should be 1.64)
1998ba (restframe 569nm) z=.43 24.5/22 = 1.1 (should be 1.43)
1998be (restframe 496nm) z=.64 18/17.5 = 1.02 (should be 1.64)
1998bi (restframe 467nm) z=.74 15.5/14.5 = 1.1 (should be 1.74)
2000fr (restframe 528nm) z=.54 24.5/22 = 1.1 (should be 1.54)


Regarding my predictions that GRBs do not need `host
galaxies` and will in many cases have none, even in the Hubble
deep field, please note that grb041219 may be offering
verification of this prediction. It is a bright GRB observed
as a compact point source, suggesting even at the limits of
observation no underlying host galaxy. If this bears out with
follow-up observations, we will have an example of how GRBs will
appear too bright for the high redshift to be accommodated
by theory. It also emphasises the need for NASA to change the
XRT localization procedure and have the UVOT camera search
the entire XRT field of view rather than just any candidate
galaxies in the field of view. I also wonder if maybe Craig,
now on the Swift team, could check the arrival times of GRBs
from HETE and SWIFT and INTEGRAL to see if they do indeed
appear to give time-of-arrival localizations that do not
match observed localizations, as I discussed with Craig
previously.


Sean
  #6  
Old December 24th 04, 10:03 AM
Steve Willner

"sean" writes:
Using the undilated restframe B band
template (ie before it was time dilated by 1.86) I simply
shifted the template forward by about 10
days on the graph. And the result is that a fit of all
the table data within error margins is produced!!


Claims such as yours are easy to make, but so far no one has
demonstrated any such thing. Please show your work in detail. For
example, give a table of the day number, the template value, and the
data value. Are you quite sure you didn't stretch the magnitude
axis? I thought we had agreed that the I-band light curve for 1997ek
does not match the B-band curve for 1995D. If that is so, 1997ek
cannot match the template unless a time dilation is put in.

--
Steve Willner Phone 617-495-7123
Cambridge, MA 02138 USA
(Please email your reply if you want to be sure I see it; include a
valid Reply-To address to receive an acknowledgement. Commercial
email may be sent to your ISP.)
  #7  
Old December 29th 04, 05:27 PM
sean

From: Steve Willner

Claims such as yours are easy to make, but so far no one has
demonstrated any such thing. Please show your work in detail.
For example, give a table of the day number, the template
value, and the data value. Are you quite sure you didn't
stretch the magnitude axis?


Yes, I did move (though not stretch) the restframe B band
scp1997 template without keeping the baseline consistent with
the table data baseline. Looking at that now, I don't think that
you would accept the results. But I'm not sure why a stretching
of the table data magnitude is unacceptable. It seems that for
most of the Knop graphs he has to do that to `normalize` the
table data to 1.0.
For instance, in the 1997ek table the peak HST reading is 3.89, but
on the graph it reads as a bit less than 1.0. (I assume to do
this Knop divides 3.89 by about 4 to get the table data to fit
his template.) That essentially is a stretching (although in
this case a compression) of the mag scale, is it not?
However, for 1997eq the V band undilated template and data do
not need to be stretched in magnitude to fit. If I match day
50817 from the 1997eq tables with day -5 on the undilated V band
template (Goldhaber table 2) I get a pretty good match, although
most are just outside the HST error margin by a very small
amount. The comparisons are as follows. The HST readings in
the first column are from the tables and include the relevant
+- error margin factored in. (ie 50817 is 0.91+-0.3)

1997eq V band template (undilated except for s)
day      HST    template
50817    0.94   0.95   out by +0.01
50824    0.88   0.99   out by +0.1
50846    0.36   0.36   matches
50855    0.25   0.22   out by -0.03
50863    0.21   0.15   out by -0.05
A rough calculation shows the V band template would only have
to be dilated by 1.2 (z=0.2) to fit the table data within error
margins, if all my numbers are right. Compare that to its
redshift, which is 1.54. In other words it's a good fit undilated,
and a best fit within error margins with a dilation of 1.2, and
thus closer to no time dilation.
I know you will say... "but the fit above isn't within error
margins"... but I notice that quite a few of Knop's templates are
also not within error margins either, by about the same amounts
as my 1997eq fit to the undilated V band template above. I've
noted these deviations below, so maybe you could check his paper
to confirm...
The 4th HST reading in 1998ay I band, at 0.4 on about day 50,
does not fit Knop's template. It's off by about .08+- by my
calculation.
And for 1998ba, all the ground based measurements do not fit
his template within their error margins at all, nor, more
importantly, does the 4th HST reading on day 50947 (out by
about 0.1).
Also in 1997ez, my calculations are that the second HST reading
does not match Knop's time dilated template within error margins.
At 0.77 (0.81 to 0.74 error margins) it is below day 9.53 on
the dilated template by about .06. (Day 9.53 is Day 5 (0.875)
on the undilated scp1997 template (table 2 Goldhaber); 5*s*z=9.53)

I have also done a visual match of 1998ba to an undilated V
band template, and it looks a pretty good fit, but before I do a
numbers match from the tables to confirm no time dilation, maybe
you could look at these problems I noticed in Knop's calculations
.....
For 1998ba I band, the 1st HST reading of about 0.925 on the graph
is listed in the tables as 5.74. (To normalize 5.74 from the
tables to the graph's 0.925 I divide it by 6.2.) Unfortunately the
other table datapoints, when divided by 6.2, do not match the graph
datapoints at all. I found that for the rest of the HST readings
one needs to divide by 7.85 to normalize to the graph. What's
happening here? Could you check that out? If this is a mistake
on Knop's part, his template will not fit the peak HST reading
from the 1998ba table data.

I thought we had agreed that the I-band light curve for 1997ek
does not match the B-band curve for 1995D. If that is so,
1997ek cannot match the template unless a time dilation is put
in.


I never agreed that the I band lightcurve did not match the B
band 1995D lightcurve. You agreed with yourself only. In fact
I posted to Bjoern proof that it does match 1995D on Dec 5 or
thereabouts. Here's the quote from my post...

"I plotted out the table data onto a graph. For multiple
measurements on the same day, like 50817 or 50819, I average out for
a day average and plot that as one datapoint (which Knop also does,
incidentally). I then have a graph where peak+1 needs to occur at
1 mag less than the highest averaged day peak of 4.65 on day 50817.
This I work out to being 1.8. On the graph the only place this can
occur with available data is between day 50835 and day 50846. Calcu-
lating a standard linear decay rate between those two points gives me
1.8 at about day 26. Etc etc for the next reading of peak+2 at 0.74...
...It also happens that the 12 day decay rate matches very closely
the 10 day rate for the restframe low z 1995D lightcurve, which
I feel strengthens the validity of my methods. It's no accident
that they match, as the `no time dilation` theory predicts it."
Sean
  #8  
Old January 5th 05, 10:37 AM
Bjoern Feuerbacher

sean wrote:

From: Bjoern Feuerbacher

So the "each consecutive day" refers to the day when you
start the counting, or what?

Well, that sounds strange. I am by no means an expert on
SN light curves, but I would have thought that the decay is
exponential (because, AFAIK, the light comes from radioactive decay),
i.e. it takes the same amount of time to decay by a certain factor no
matter where one starts the counting.


Maybe, but remember you also read 14 days off the graph, which matches
my reading of 14.5 and conflicts with Steve's 16 days.


Err, what on earth do these 14 days have to do with the
increase in lightcurve decay time mentioned above?



For the third time I ask: do you have access to Mathematica
or Maple? If yes, I could try to tell you how to do the
fits for yourself.



No I don't, sorry. I've never heard of them, actually.


A pity... :-(

Both are quite good programs for doing standard math tasks.
Do you have any other programming experience? Doing a chi squared
fit by hand is almost impossible.
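
(Almost impossible by hand, that is. If you can run any program at
all, even a crude brute-force grid search would do as a first
taste - a rough Python sketch, continuing the earlier ones, with
all ranges invented:)

```python
from itertools import product

def chi2(params, t, I, sigma):
    # Weighted sum of squared deviations between data and model.
    Imax, tmax, w, b = params
    r = (I - model(t, Imax, tmax, w, b)) / sigma
    return float(np.sum(r * r))

# Coarse grid over the four parameters (invented ranges; b fixed).
grid = product(np.linspace(0.8, 1.2, 9),     # Imax
               np.linspace(-5.0, 5.0, 11),   # tmax
               np.linspace(1.0, 3.0, 21),    # w = s*(1+z)
               [0.0])                        # b
best = min(grid, key=lambda p: chi2(p, t_obs, I_obs, 0.02))
print("grid minimum (Imax, tmax, w, b):", best)
```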


Then you would have only s instead of s*(1+z) in the formula,
and the result is obvious: the s resulting from the fit would
be proportional to 1+z. It simply can't be another way, due
to the math of the fitting process!



This would only be the case if the universe were expanding.


*sigh* No. No. No.

This is a simple consequence of the *math*. It can not come out
another way! This has nothing to do with the actual physics.
It simply follows logically that if one uses two completely
equivalent fitting procedures for one and the same data, the
result will be the same!


BUT if I am correct and there is no time dilation, then you
would have s as roughly constant


No. This simply can not happen! That would be a *mathematical*
impossibility!!!

*Please* try to understand the actual mathematical procedure;
then you will see that for yourself!


and the result a fit
of the restframe template to the data. Anyway, you haven't
tried the fit, nor have I, so it seems only speculation
at this point.


No. This is not speculation. This is a *mathematical* *fact*.
As sure as the fact that 2 + 2 = 3 + 1.


You may be interested to hear that I plotted the B and V band
restframe templates used by Knop and supplied as table data
in his and Goldhaber's papers.


Huh? What, exactly, did you plot? Which tables? And how?


I then scaled the time axis
by 1.86 % in fireworks


Huh? Do you mean a factor of 1.86 instead of 1.86 % here?


and found they fit his templates very well.


Which templates?


Showing a) that he can show that a time dilated
template can fit the data, and b) that scaling by percentages
in graphic software like fireworks is acceptable and the
same as a mathematical calculation of multiplying each data
point in the tables by 1.86.


This will give you only a *visual* comparison. That is *much*
worse than an actual chi squared fit, which not only gives
you the *best* fit of the curve to *all* data, but also tells you
*quantitatively* how good the fit is!


I also multiplied by s and that
gives a better fit.


By which s???


I'm sure that will make you happy, but the
next bit will not. Using the undilated restframe B band
template (ie before it was time dilated by 1.86) I simply
shifted the template forward by about 10
days on the graph. And the result is that a fit of all
the table data within error margins is produced!!
In other words, although a time dilated B band template can
give a fit of the table data, as Knop has done, a non time
dilated B band template also gives as good a fit to
the data. The only difference is which day one puts the
peak of the template.


All done by mere visual comparison instead of an actual quantitative
mathematical analysis, and hence quite worthless.


Furthermore, I tried doing this with
most of the other 11 SN and found that the appropriate B or
V band non time dilated templates fit all the data in each
SN's set of table data. I am sure a re-fit of all the SN
data with non dilated templates will verify my findings
and prove that there is at least as strong an argument
for no time dilation as there is for time dilation.
And if you don't believe me, try a fit of 1997ek with z=1


Please come back when you have more than just visual comparisons.


1998as: 0.4 at day 34, 0.16 at day 60 (both again with
an error margin of +- 2 days), so the difference would be
26 +- 3 days.



Thanks for that reading. It's close to my reading of 23 days
below in the table, so that makes both your 1997ek and
1998as match my table readings reprinted below. (You had
14 days for 1997ek, I had 14.5.) How did you get error
margins off the template? (It doesn't have error margins.)


Err, simply by estimating how well one can read the data off the
graph. Since the pictures are not very sharp, an estimated
error margin of 2 days for reading the data off looks quite sensible.



Although for 1998as, a careful analysis at 800% in Acrobat
shows that peak+1 is at the middle pixel between 30 and 40,
making it officially 35 for peak+1, and the 2nd last pixel of
seven between 50 and 60, making it roughly 58 for peak+2.
That's then 58-35=23, exactly as I had found earlier, and not
26 as you get.


Err, did you miss my comment where I pointed out that you can't
even be sure that the pictures are accurate, that there could
be problems with the printing of them already?

And *please* use error margins!


Anyway, if nothing else, it shows that my readings off the
templates are roughly correct. And on average the ratios
below are much closer to no time dilation than time dilation.
Although if you add in error margins, I imagine you can argue
that the results can weakly support time dilation.


Knop et al. did an actual mathematical analysis, and Goldhaber
did an even stronger one. Both showed that there is *strong*
support for time dilation. Your hand-wavy methods are still
far from challenging that.


That's OK as
long as you realize that within error margins the results also
support no time dilation. In fact, within error margins,
all the data supports much more strongly a no time dilation
argument.


1997ek (restframe 438nm) z=.86 14.5/10 = 1.45 (should be 1.86)
1998eq (restframe 469nm) z=.54 15/14.5 = 1.03 (should be 1.54)
1997ez (restframe 457nm) z=.78 16/13 = 1.2 (should be 1.78)
1998as (restframe 602nm) z=.35 23/22 = 1.04 (should be 1.35)


According to my reading, (26 +- 3)/22 = 1.18 +- 0.14. Actually,
the error has to be greater, since the 22 days also have an
error margin.
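
(Explicitly, with standard error propagation in quadrature, using
the +- 2 day margin on the 22 days as well:)

```latex
r = \frac{26 \pm 3}{22 \pm 2}, \qquad
\sigma_r \approx r\,\sqrt{\left(\tfrac{3}{26}\right)^{2}
                        + \left(\tfrac{2}{22}\right)^{2}}
         \approx 0.17,
```

so the ratio is more like 1.18 +- 0.17.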


1998aw (restframe 565nm) z=.44 27/20.5 = 1.3 (should be 1.44)
1998ax (restframe 542nm) z=.50 30/22 = 1.36 (should be 1.50)
1998ay (restframe 496nm) z=.64 20/17.5 = 1.2 (should be 1.64)
1998ba (restframe 569nm) z=.43 24.5/22 = 1.1 (should be 1.43)
1998be (restframe 496nm) z=.64 18/17.5 = 1.02 (should be 1.64)
1998bi (restframe 467nm) z=.74 15.5/14.5 = 1.1 (should be 1.74)
2000fr (restframe 528nm) z=.54 24.5/22 = 1.1 (should be 1.54)


[snip discussion of GRBs - irrelevant for this thread]


Bye,
Bjoern
  #9  
Old January 6th 05, 08:58 PM
sean

Bjoern Feuerbacher wrote..
Err, what on earth do these 14 days have to do
with the increase in lightcurve decay time
mentioned above?


As I mentioned in my earlier post, I calculated that if
the decay were linear then, using Steve's calculations,
I could calculate `back` and get 14 days for a 1 mag
decay from peak+1. As it happens, both you and I
also `read` around 14 days for a 1 mag decay from the
graphs. So if you don't like me suggesting that the
decay is linear and gives 14 days, then ignore that and
just use the 14 days that you and I read off the graph.
Same result either way.

Doing a chi squared fit by hand is almost impossible.


Maybe for a best fit, but one can easily do the
calculations for 1 single fit to see if that works. What
I have done is visually found a close fit using graphic
software and noted the new peak day for the restframe
undilated template. I then multiply each day value from
the scp 1997 table by the s value (for that SN) and see
if this new calculated value falls within or close to the
error margins from the high redshift SN data tables. I
did this for a couple and posted them to Steve, but
repost them here, as Steve is unable to accept the fact
that a non dilated template fits the available data as
well as a time dilated template. Notice how he
fails to respond to the points I raise about Knop's
templates not fitting, and he also fails to respond to
the proof that in 1997eq it can be shown that a non
dilated template fits as well as the dilated templates.
Furthermore, if he or you can confirm what the peak HST
reading is for 1998ba, I can show a good fit, using
numbers, and possibly better than the dilated template
that Knop uses. Here is the relevant part of my post
to Steve ...

If I match day 50817 from the 1997eq tables with day -5 on
the undilated V band template (Goldhaber table 2) I get
a pretty good match, although most are just outside the
HST error margin by a very small amount. The comparisons
are as follows. The HST readings in the first column are
from the tables and include the relevant +- error margin
factored in. (ie 50817 is 0.91+-0.3) The second column
under `V band` is the template day times s, with day
0 on the template matching day 50822 from the tables

1997eq V band template (undilated except for s)
day      HST    template
50817    0.94   0.95   out by +0.01
50824    0.88   0.99   out by +0.1
50846    0.36   0.36   matches
50855    0.25   0.22   out by -0.03
50863    0.21   0.15   out by -0.05
A rough calculation shows the V band template would
only have to be dilated by 1.2 (z=0.2) to fit the table
data within error margins, if all my numbers are right.
Compare that to its redshift, which is 1.54. In other
words it's a good fit undilated, and a best fit within
error margins with a dilation of 1.2, and thus closer to
no time dilation.
I know you will say... "but the fit above isn't within
error margins"... but I notice that quite a few of Knop's
template fits are also not within the observed data error
margins, by about the same amounts as my 1997eq fit to
the undilated V band template above. I've noted these
deviations below, so maybe you could check his paper
to confirm...
The 4th HST reading in 1998ay I band, at 0.4 on about day
50, does not fit Knop's template. It's off by about .08+- by
my calculation.
And for 1998ba, all the ground based measurements do not
fit his template within their error margins at all, nor,
more importantly, does the 4th HST reading on day 50947
(out by about 0.1).
Also in 1997ez, my calculations are that the second HST
reading does not match Knop's time dilated template
within error margins. At 0.77 (0.81 to 0.74 error
margins) it is below day 9.53 on the dilated template by
about .06. (Day 9.53 is Day 5 (0.875) on the undilated
scp1997 template (table 2 Goldhaber); 5*s*z=9.53)

I have also done a visual match of 1998ba to an undilated
V band template, and it looks a pretty good fit, but before
I do a numbers match from the tables to confirm no time
dilation, maybe you could look at these problems I noticed
in Knop's calculations
.....
For 1998ba I band, the 1st HST reading of about 0.925 on
the graph is listed in the tables as 5.74. (To normalize
5.74 from the tables to the graph's 0.925 I divide it
by 6.2.) Unfortunately the other table datapoints, when
divided by 6.2, do not match the graph datapoints at all.
I found that for the rest of the HST readings one needs
to divide by 7.85 to normalize to the graph. What's
happening here? Could you check that out? If this is a
mistake on Knop's part, his template will not fit at all
near the peak HST reading from the 1998ba table data.

Then you would have only s instead of s*(1+z) in the
formula, and the result is obvious: the s resulting from
the fit would be proportional to 1+z. It simply can't
be another way, due to the math of the fitting process!


This would only be the case if the universe were expanding.


*sigh* No. No. No.

Sorry, I wasn't making myself clear enough. Yes, in a
calculation for a non time dilated universe s would be
proportional to 1+z. But because z is always 0 in the
calculations, 1+z is always written as 1+0 for every
high redshift SN. So you end up with s not being proportional
to redshift in a non time dilated universe, simply because
in the maths the variable redshift z is always stated as 0, to
show that the restframe template is *not being time dilated*.
It couldn't be more simple to understand. Here it is again...
To prove a non time dilated universe one DOES NOT DILATE THE
RESTFRAME TEMPLATE by z when comparing to an observed high
redshift SN. Understand now? If you don't, then tell me:
Why would I dilate the restframe template by z to show that
the observed high redshift SN in question wasn't dilated?
Or maybe I should just ask you to try the calculations
yourself and replace z with 0 to see what you get.
I also multiplied by s and that
gives a better fit.


By which s???


This is a silly question. By whatever the s value is
for that particular SN. What else did you think I meant?

1997ek (restframe 438nm) z=.86 14.5/10 = 1.45 (should be 1.86)
1998eq (restframe 469nm) z=.54 15/14.5 = 1.03 (should be 1.54)
1997ez (restframe 457nm) z=.78 16/13 = 1.2 (should be 1.78)
1998as (restframe 602nm) z=.35 23/22 = 1.04 (should be 1.35)

According to my reading, (26 +- 3)/22 = 1.18 +- 0.14.
Actually, the error has to be greater, since the 22 days
also have an error margin.


As I have said already, a more accurate reading of the graph
will give you 23 days = 1.04 (+- 0.14). That's almost 1,
which is much closer to no time dilation than time dilation,
even with error margins. But if you insist on the 26 days,
then your above numbers (1.18 +- 0.14) are still much closer
to 1 than to 1.35 within error margins, so you seem to be
ignoring the fact that your own readings off the graph
support a no time dilation argument better than a time
dilation argument.
Not only that, but as your two supplied readings out of the
available 11 are very close to my 2 from the 11, one can
presume that my other nine readings will also be roughly
correct even if you were to double check those. And I
have shown all 11 on average support no time dilation
more strongly than time dilation within error margins.

1998aw (restframe 565nm) z=.44 27/20.5 = 1.3 (should be 1.44)
1998ax (restframe 542nm) z=.50 30/22 = 1.36 (should be 1.50)
1998ay (restframe 496nm) z=.64 20/17.5 = 1.2 (should be 1.64)
1998ba (restframe 569nm) z=.43 24.5/22 = 1.1 (should be 1.43)
1998be (restframe 496nm) z=.64 18/17.5 = 1.02 (should be 1.64)
1998bi (restframe 467nm) z=.74 15.5/14.5 = 1.1 (should be 1.74)
2000fr (restframe 528nm) z=.54 24.5/22 = 1.1 (should be 1.54)
Sean
  #10  
Old January 7th 05, 12:06 PM
Bjoern Feuerbacher

sean wrote:

sean, as long as you
1) insist that visual comparisons are as valid as an
actual mathematical chi squared fit, and
2) refuse to do a proper error analysis for your readings
off the graphs,
this discussion is quite pointless.


Bjoern Feuerbacher wrote..

Err, what on earth do these 14 days have to do
with the increase in lightcurve decay time
mentioned above?



As I mentioned in my earlier post, I calculated that if
the decay were linear then, using Steve's calculations,
I could calculate `back` and get 14 days for a 1 mag
decay from peak+1.


I remember your calculation - but IIRC it did *not*
use a *linear* decay.

Also, what justifies the assumption of a linear decay?


[snip]


Doing a chi squared fit by hand is almost impossible.



Maybe for a best fit, but one can easily do the
calculations for 1 single fit to see if that works.


Show your work.


What
I have done is visually found a close fit using graphic
software and noted the new peak day for the restframe
undilated template.


Visual fits are in no way as rigorous as an actual calculation.


I then multiply each day value from
the scp 1997 table by the s value (for that SN) and see
if this new calculated value falls within or close to the
error margins from the high redshift SN data tables. I
did this for a couple and posted them to Steve, but
repost them here, as Steve is unable to accept the fact
that a non dilated template fits the available data as
well as a time dilated template.


By a *visual* comparison only.


Notice how he
fails to respond to the points I raise about Knop's
templates not fitting, and he also fails to respond to
the proof that in 1997eq it can be shown that a non
dilated template fits as well as the dilated templates.


He simply is fed up with your attitude that fits which
you do visually have as much validity as an actual
chi squared calculation.


Furthermore, if he or you can confirm what the peak HST
reading is for 1998ba, I can show a good fit, using
numbers, and possibly better than the dilated template
that Knop uses.


As long as you continue claiming that your visual fits
have as much (or even more) validity than Knop et al.'s
and Riess et al.'s actual mathematical statistical analyses,
I see no point in this.


Here is the relevant part of my post to Steve ...

If I match day 50817 from the 1997eq tables with day -5 on
the undilated V band template (Goldhaber table 2) I get
a pretty good match, although most are just outside the
HST error margin by a very small amount. The comparisons
are as follows. The HST readings in the first column are
from the tables and include the relevant +- error margin
factored in. (ie 50817 is 0.91+-0.3) The second column
under `V band` is the template day times s, with day
0 on the template matching day 50822 from the tables

1997eq V band template (undilated except for s)
day      HST    template
50817    0.94   0.95   out by +0.01
50824    0.88   0.99   out by +0.1
50846    0.36   0.36   matches
50855    0.25   0.22   out by -0.03
50863    0.21   0.15   out by -0.05


Nice. 5 data points for a single SN. Why on earth do you think this
proves anything?


A rough calculation shows the V band template would
only have to be dilated by 1.2 (z=0.2) to fit the table
data within error margins, if all my numbers are right.
Compare that to its redshift, which is 1.54. In other
words it's a good fit undilated, and a best fit within
error margins with a dilation of 1.2, and thus closer to
no time dilation.

I know you will say... "but the fit above isn't within
error margins"... but I notice that quite a few of Knop's
template fits are also not within the observed data error
margins, by about the same amounts as my 1997eq fit to
the undilated V band template above.


I told you how to do a chi squared fit in principle:
calculate the deviations between the theoretical curve
and the data for *all* data points, square them and add them
up. Then search for the minimum of the obtained number.
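
(In practice one hands that search to an optimizer; a rough
Python sketch, reusing the chi2 helper and the invented data from
before, with starting guesses also invented:)

```python
from scipy.optimize import minimize

# Derivative-free search for the minimum of the weighted chi-squared.
res = minimize(lambda p: chi2(p, t_obs, I_obs, 0.02),
               x0=[1.0, 0.0, 2.0, 0.0], method="Nelder-Mead")
print("best-fit (Imax, tmax, w, b):", res.x, "chi2:", res.fun)
```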

As long as you have not done this, you have no basis for
claiming that your fit is better than the one used by
Knop et al.


[snip more of this]


Then you would have only s instead of s*(1+z) in the
formula, and the result is obvious: the s resulting from
the fit would be proportional to 1+z. It simply can't
be another way, due to the math of the fitting process!



This would only be the case if the universe were expanding.



*sigh* No. No. No.

Sorry, I wasn't making myself clear enough. Yes, in a
calculation for a non time dilated universe s would be
proportional to 1+z. But because z is always 0 in the
calculations, 1+z is always written as 1+0 for every
high redshift SN.


*sigh* No. No. No.

You *still* have not understood the actual method. Try again.


So you end up with s not being proportional
to redshift in a non time dilated universe


The *data* shows time dilation, no matter if you include
a factor 1+z in the fit or not. It is a *mathematical* *fact*
that both methods (using s or s(1+z)) will give the same result.
How often do we have to repeat this until you understand it?


simply because
in the maths the variable redshift z is always stated as 0


You make no sense at all.


[snip]


To prove a non time dilated universe one DOES NOT DILATE THE
RESTFRAME TEMPLATE by z when comparing to an observed high
redshift SN. Understand now?


Yes. I've understood all along what you want to do. But you *still* fail
to understand: when doing the chi squared analysis with s
only instead of s(1+z), the resulting s will *come out of
the mathematical analysis to be proportional to 1+z*. This
is an unavoidable *mathematical* *fact*.


If you don't, then tell me:
Why would I dilate the restframe template by z to show that
the observed high redshift SN in question wasn't dilated?


Huh? Sorry, I don't understand the question.


Or maybe I should just ask you to try the calculations
yourself and replace z with 0 to see what you get.


Not necessary. The outcome is a *mathematical* *fact*.



I also multiplied by s and that
gives a better fit.


By which s???



This is a silly question.


No, not at all.


By whatever the s value is
for that particular SN. What else did you think I meant?


I did not know. I would not have asked if I knew!


1997ek (restframe 438nm) z=.86 14.5/10 = 1.45 (should be 1.86)
1998eq (restframe 469nm) z=.54 15/14.5 = 1.03 (should be 1.54)
1997ez (restframe 457nm) z=.78 16/13 = 1.2 (should be 1.78)
1998as (restframe 602nm) z=.35 23/22 = 1.04 (should be 1.35)


I notice that you *still* do not bother to give error
margins.



According to my reading, (26 +- 3)/22 = 1.18 +- 0.14.
Actually, the error has to be greater, since the 22 days
also have an error margin.



As I have said already, a more accurate reading of the graph
will give you 23 days = 1.04 (+- 0.14).


Why do you think you are in the position to judge if yours
or my reading is more accurate?

I *also* used a good magnification to read off the graph.


That's almost 1,
which is much closer to no time dilation than time dilation,
even with error margins. But if you insist on the 26 days,
then your above numbers (1.18 +- 0.14) are still much closer
to 1 than to 1.35 within error margins


So you conveniently ignore my remark that the error margin
actually has to be greater?


so you seem to be
ignoring the fact that your own readings off the graph
support a no time dilation argument better than a time
dilation argument.


For this one single example, yes. So what? We are talking
about quite a big amount of data here.

Even if you manage to show this for all 11 of Knop's SNs
(and I strongly doubt that), you still have not addressed all the other
SNs used by previous researchers.

Anyway, as long as you refuse to do a proper error analysis,
such a discussion is moot.


Not only that, but as your two supplied readings out of the
available 11 are very close to my 2 from the 11, one can
presume that my other nine readings will also be roughly
correct even if you were to double check those.


That's a rather strange jump to conclusions. Because
we agree *roughly* on two, I have to agree on all the other
nine, too?


And I
have shown all 11 on average support no time dilation
more strongly than time dilation within error margins.

1998aw (restframe 565nm) z=.44 27/20.5 = 1.3 (should be 1.44)
1998ax (restframe 542nm) z=.50 30/22 = 1.36 (should be 1.50)
1998ay (restframe 496nm) z=.64 20/17.5 = 1.2 (should be 1.64)
1998ba (restframe 569nm) z=.43 24.5/22 = 1.1 (should be 1.43)
1998be (restframe 496nm) z=.64 18/17.5 = 1.02 (should be 1.64)
1998bi (restframe 467nm) z=.74 15.5/14.5 = 1.1 (should be 1.74)
2000fr (restframe 528nm) z=.54 24.5/22 = 1.1 (should be 1.54)


Show the error margins.


Bye,
Bjoern
 



