[Reliable_computing] [BULK] Re: sinc function for intervals
Ralph Baker Kearfott
C00255736 at louisiana.edu
Sun Jul 7 19:03:32 CDT 2019
Brian,
Now that you mention it, yes, I need to amend my previous statement.
Yes, if you plug intervals into the terms of the Taylor series, that
won't give you a reasonable result. The thing to do is to plug the
point values (suitably represented as intervals) into the terms and
an interval into the error term.
On the other hand, the best way, if possible, is to use point values
of known accuracy along with known intervals in which the function
is monotonic.
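A rough sketch of the first approach (Python here; `sinc_point_enclosure` is an illustrative name, and the crude ulp padding stands in for the directed rounding a real implementation would use):

```python
import math

def sinc_point_enclosure(x, n=8):
    # Truncated alternating series  sinc(x) = sum_k (-1)^k x^(2k)/(2k+1)!
    # evaluated at the *point* x; while the terms are decreasing in
    # magnitude, the first omitted term bounds the truncation error.
    s = sum((-1) ** k * x ** (2 * k) / math.factorial(2 * k + 1)
            for k in range(n))
    r = abs(x) ** (2 * n) / math.factorial(2 * n + 1)  # interval error term
    pad = r + 8 * math.ulp(1.0)  # crude allowance for rounding in s
    return (s - pad, s + pad)

lo, hi = sinc_point_enclosure(1.0)  # a tight enclosure of sin(1)/1
```

The point value goes into the terms; only the error term (plus a rounding allowance) is treated as an interval, so the enclosure stays tight.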
Yes, people do use Taylor series in interval computations. Consider
earlier work of Berz and Makino, for example.
Baker
On 7/7/19 6:08 PM, Brian Kennedy wrote:
> I made a cut & paste error in my prior email … corrected in CAPS below.
> And then I wanted to comment/query on a few of Alan’s questions...
>
> On Q1, if x contains 0, then absolutely yes the sup of sinc(x) should be exactly 1.
> If there is a counter-argument to that, I’d like to understand that. ??
>
> On Q3, my answer is “both”: an interval does contain all the values between its endpoints (3b, yes);
> when computing interval sinc(x), the returned interval should be guaranteed to include all possible
> results of sinc(x) (3a, yes), but the resulting interval may have to include extra values because it
> returns not a general set but a simple interval (which is necessarily a contiguous set from one
> endpoint to the other).
>
> So, if you defined sqrt(x) to return both the positive and negative results, then the interval result would
> contain a lot of extra values in many cases. sqrt([4, 9]) would be [-3, 3]. That’s why I would prefer
> sqrt to be defined to return only the positive results, so that you get [2, 3] and can negate it if needed.
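The principal-branch convention is a one-liner (a sketch assuming a non-negative input interval; `interval_sqrt` is an illustrative name):

```python
import math

def interval_sqrt(lo, hi):
    # Principal square root only; it is monotone increasing on [0, inf),
    # so the endpoint images give the exact range (up to rounding).
    if lo < 0.0:
        raise ValueError("interval must be non-negative")
    return (math.sqrt(lo), math.sqrt(hi))

# interval_sqrt(4.0, 9.0) -> (2.0, 3.0); negate it for the negative branch.
```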
>
> Are there counter-arguments? Am I missing something here?
> (I don’t see the issue you are raising in your BIG QUESTION… so, most likely I am missing
> something… please educate me.)
>
> On Q4, yeah in the world of intervals, Taylor expansions often make things much worse.
> I avoid them like the plague. Other than a few special cases, does anyone do otherwise?
>
> On Q6, you’ll be rounding down if using the endpoint or extremum as inf, and rounding up if using
> the endpoint or extremum as sup. You can determine in advance which region the endpoint is in and
> whether that region is rising or falling, so you know whether it is a candidate sup or a candidate inf…
> it’ll never be both.
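Since Python's `math.sin` does not expose directed rounding, one crude stand-in is to widen each computed bound by one ulp in the appropriate direction (`outward` is an illustrative name; this assumes the underlying libm is accurate to within about an ulp, which a real interval library would not take on faith):

```python
import math

def outward(lo, hi):
    # Round the candidate infimum down and the candidate supremum up by
    # one ulp each -- a crude substitute for true directed rounding.
    return (math.nextafter(lo, -math.inf), math.nextafter(hi, math.inf))
```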
>
> Final note: if you’ve already implemented interval cos(x) in Frink, I’d think you could borrow
> some of the same logic for sinc(x)… the same peaks and valleys except near 0 ([-pi/2, pi/2])…
> on top of that, you need to add evaluation of the peaks closest to zero (except for 0, where
> you know the value is 1… much like cos(x)). If that’s not so, I’d be curious why.
>
> Cheers,
> Brian
>
>
>> On Jul 7, 2019, at 10:07 AM, Brian Kennedy <BrianK at targetedconvergence.com> wrote:
>>
>> Hi Alan,
>>
>> You know where the peaks are: 0, n * 2pi + pi/2 for positive n, n * 2pi - pi/2 for negative n
>> You know where the valleys are: n * 2pi - pi/2 for positive n, n * 2pi + pi/2 for NEGATIVE n
>> (I think I got those right… but if not, it is something like that.)
>>
>> And you know the largest magnitude extrema will be the ones closest to 0.
>>
>> So, to evaluate sinc(x), you evaluate at the endpoints and at the extrema within the endpoints that are closest to zero.
>> That avoids/handles the dependency problem for this case.
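Brian's recipe (endpoints, plus interior extrema, plus 1 when 0 is inside) can be sketched as below. This is a toy version: `extremum_near` and `sinc_interval` are illustrative names, the extrema are located by plain bisection on the numerator of sinc', it checks every interior extremum rather than only those closest to zero, and there is no outward rounding, which a real interval library would add:

```python
import math

def sinc(x):
    return 1.0 if x == 0.0 else math.sin(x) / x

def extremum_near(n):
    # n-th positive extremum of sinc: the root of tan(x) = x in
    # (n*pi, n*pi + pi/2), located by bisection on
    # f(x) = x*cos(x) - sin(x), the numerator of sinc'(x).
    f = lambda x: x * math.cos(x) - math.sin(x)
    lo, hi = n * math.pi, n * math.pi + math.pi / 2
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

def sinc_interval(a, b):
    # Candidates: the endpoints, 1 if 0 lies inside, and every interior
    # extremum (sinc is even, so the negative side reuses the same roots).
    # A tighter version would stop at the extrema closest to zero, since
    # |sinc| decays like 1/|x|.
    cands = [sinc(a), sinc(b)]
    if a <= 0.0 <= b:
        cands.append(1.0)
    n = 1
    while n * math.pi <= max(abs(a), abs(b)):
        x = extremum_near(n)
        if a <= x <= b:
            cands.append(sinc(x))
        if a <= -x <= b:
            cands.append(sinc(-x))
        n += 1
    return (min(cands), max(cands))
```

On Alan's example [2, 7] this picks up the local minimum near 4.4934 that the endpoints alone would miss.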
>>
>> Brian
>>
>>
>>> On Jul 7, 2019, at 4:31 AM, Alan Eliasen <eliasen at mindspring.com> wrote:
>>>
>>>
>>> I thought I would take a few minutes and implement the sinc function
>>> in my programming language Frink ( https://frinklang.org/ ) and then 2
>>> hours later realized it wasn't going to be that simple.
>>>
>>> As you all know, the sinc function is defined as:
>>>
>>> sin[x]/x
>>>
>>> where the value at x=0 is defined to be 1 because the limit converges
>>> to 1 at this point. That's the only special thing about this function.
>>>
>>> However, a version of this function over intervals has several issues:
>>>
>>> 1.) If the interval contains zero, the supremum of the result should
>>> be exactly 1. (Or should it? See 3.)
>>>
>>> 2.) The function is non-monotonic, so we can't evaluate it at just
>>> its endpoints. For example, if we evaluated it only at the endpoints
>>> of [2, 7], we would miss a local minimum.
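A few lines make the failure on [2, 7] concrete (the 4.4934 below is just the approximate location of the first positive extremum):

```python
import math

sinc = lambda x: math.sin(x) / x if x != 0.0 else 1.0

# Both endpoint values on [2, 7] are positive...
ends = (sinc(2.0), sinc(7.0))   # about (0.4546, 0.0939)
# ...but sinc dips to about -0.217 near x = 4.4934, inside the interval,
# so [min(ends), max(ends)] fails to enclose the true range.
dip = sinc(4.4934)
```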
>>>
>>> 3.) This function is the embodiment of the "dependence problem" or
>>> the "overestimation problem": that is, if a variable is used multiple
>>> times in an expression, as it is in sin[x]/x, then the computed
>>> interval may be wider than the true range of the expression. This has
>>> implications for your interpretation of intervals. These are:
>>>
>>> 3.a.) An interval contains its result somewhere between its bounds,
>>> but we're not sure where.
>>>
>>> 3.b) An interval contains *all* of the values between its bounds
>>> simultaneously.
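The overestimation in 3.) shows up even on a well-behaved interval like [1, 2], where sinc is monotone. A short sketch (Python rather than Frink; the helper names are mine, and `interval_sin` is simplified to inputs inside [0, pi]):

```python
import math

def interval_sin(a, b):
    # Range of sin over [a, b], assuming [a, b] lies inside [0, pi] so
    # the only interior extremum to check is pi/2 (a simplification).
    vals = [math.sin(a), math.sin(b)]
    if a <= math.pi / 2 <= b:
        vals.append(1.0)
    return (min(vals), max(vals))

def interval_div(lo1, hi1, lo2, hi2):
    # Interval quotient, assuming 0 < lo2 (denominator strictly positive).
    q = [lo1 / lo2, lo1 / hi2, hi1 / lo2, hi1 / hi2]
    return (min(q), max(q))

# Naive evaluation of sinc over X = [1, 2]: sin treats X as one variable
# and the division treats it as another, so their correlation is lost.
naive = interval_div(*interval_sin(1.0, 2.0), 1.0, 2.0)

# True range: sinc is monotone decreasing on [1, 2].
true = (math.sin(2.0) / 2.0, math.sin(1.0) / 1.0)
```

The naive result strictly contains the true range on both sides, which is exactly the dependence problem.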
>>>
>>> 4.) One could write the Taylor series expansion of sin[x] as
>>>
>>> sin[x] = x - x^3/3! + x^5/5! - x^7/7! + ...
>>>
>>> and the expansion of sinc[x] = sin[x]/x as:
>>>
>>> sinc[x] = 1 - x^2/3! + x^4/5! - x^6/7! + ...
>>>
>>> which raises the question of the dependence problem again, probably
>>> making it much worse in most cases.
>>>
>>>
>>> 5.) A cool thing about the sinc[x] function is that it has its local
>>> extrema at exactly the points that sinc[x] intersects cos[x]. Can you
>>> leverage this with respect to the sinc function? Can your mathematical
>>> system solve this exactly?
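That identity (sinc' vanishes exactly where cos[x] crosses sinc[x], since sinc'(x) = (x cos x - sin x)/x^2) can at least be exploited numerically, if not exactly. A bisection sketch for the first positive extremum:

```python
import math

# g vanishes exactly where cos[x] crosses sinc[x], i.e. at the extrema.
g = lambda x: math.cos(x) - math.sin(x) / x

# The first positive extremum lies in (pi, 3*pi/2): g(pi) = -1 < 0 and
# g(3*pi/2) = 2/(3*pi) > 0, so plain bisection suffices.
lo, hi = math.pi, 1.5 * math.pi
for _ in range(80):
    mid = 0.5 * (lo + hi)
    if g(lo) * g(mid) <= 0.0:
        hi = mid
    else:
        lo = mid
x_star = 0.5 * (lo + hi)  # first local minimum of sinc, near 4.4934
```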
>>>
>>> 6.) When evaluating the bounds of sinc[x] at its endpoints and
>>> extrema, where do you round down and where do you round up?
>>>
>>>
>>> THE BIG QUESTION: Has anyone done analysis of the sinc function with
>>> respect to real intervals and provided opinion on the way it should be
>>> treated? Because any implementation of this function is dependent on
>>> your interpretation of 3.) and the "dependency problem."
>>>
>>> Solving sinc[x] naively gives wider bounds than we may want, as in
>>> any other case of the dependence problem.
>>>
>>> --
>>> Alan Eliasen
>>> eliasen at mindspring.com
>>> https://futureboy.us/
>>> _______________________________________________
>>> reliable_computing mailing list
>>> reliable_computing at lists.louisiana.edu
>>> https://lists.louisiana.edu/mailman/listinfo/reliable_computing
>>
>