Bible Pay

Read 410473 times

  • Rob Andrews
  • Administrator

    • 4090


    • 97
    • June 05, 2017, 08:09:04 PM
    • Patmos, Island Of
On the topic of the Ryzen 7 1700 (3.0GHz):

Apparently they can go up to 4.1GHz or so, and I think that's what the top Ryzen 7 1700 machines on Rosetta are using to get 10k RAC:
https://boinc.bakerlab.org/rosetta/show_host_detail.php?hostid=3194869
https://boinc.bakerlab.org/rosetta/show_host_detail.php?hostid=3291011

I have both of my Ryzens overclocked to 3.8GHz now,

and for the i7 2600K I bumped it from 3.4GHz to 3.8GHz yesterday.

I have no experience overclocking, so this has been interesting.

Wow, those are beasts!  So I should be looking for 10,000 RAC out of this baby.  OK, I'll keep it running; not going to overclock it 'til we are in prod, LOL.

I think it was the sleep setting in Win7: the default was to sleep after 30 minutes.  It's making more RAC than the Zotac was when it slept, and I only disturbed it once per day by waking it up by accident; so much for green power...  Well, at least the solar project might be fun in the future.



  • T-Mike
  • Sr. Member

    • 375


    • 2
    • February 06, 2018, 06:12:58 PM
Wow, those are beasts!  So I should be looking for 10,000 RAC out of this baby.  OK, I'll keep it running; not going to overclock it 'til we are in prod, LOL.

I think it was the sleep setting in Win7: the default was to sleep after 30 minutes.  It's making more RAC than the Zotac was when it slept, and I only disturbed it once per day by waking it up by accident; so much for green power...  Well, at least the solar project might be fun in the future.

I'm actually surprised that it only gets 10K RAC; my CPU, which has a lower benchmark score, is getting 10K credits a day.


  • orbis
  • Full Member

    • 215


    • 7
    • February 08, 2018, 04:37:14 PM
Wow, this is awesome, thanks!!!!
Payout from the last superblock arrived without a problem in the old wallet with the old magnitude... nice :)


  • T-Mike
  • Sr. Member

    • 375


    • 2
    • February 06, 2018, 06:12:58 PM
This is a relatively impressive spreadsheet, in that it uses the correct exponential base (e ≈ 2.71) that BOINC uses, and the decay formula appears to be correct, but I see a few bugs in the sheet that cause it to lag over time:

- This does not take into consideration the compounding effect of RAC.  RAC is compounded at least once per day (roughly), when a host solves a WU.  For example, let's say machine A had zero RAC and worked for 24 hours to solve a 100-credit work unit, once per day.  That means it would check in one day after epoch (86,400 seconds since inception) with 100 credits, and while those 100 are granted, its first decay point would be ONE day; its RAC in would be the age of its PRIOR RAC (of 0).  On day 2, its prior RAC of 100 would be 2 days old and its new 100 credits would be 1 day old, for a grand total of 189 RAC on day 2.


So you must add a running total to this to show the prior day's RAC total and the new total, so that they can always be decayed separately per the formula (not 0 for one and the grand total for the in).

- If the machine is crunching 100 new credits per day, then the grand total in 4 weeks must be a RAC of 903 (not 100).  100 credits per day means 100+100+100... = 500 RAC on week 1...  not 50 RAC on week 1.

So I think if you add a running total for the prior day's end and the new day's beginning, it will match BOINC. 
A 100 RAC input should yield ~920 or so RAC at the end of the 31-day period...

Great job!

Let's clear up one thing first: when you say 100 RAC input, do you mean you are actually RECEIVING 100 credits per day? Because it doesn't make any sense for the RAC to be 920 if you're only receiving 100 credits per day. Your RAC should be 100.

The RAC on T-COMPUSTICK is actually pretty close to what the Excel sheet is giving me. I receive about 1080 credits/day, and I'm getting a RAC of 873 on the website and 899 in the Excel file. The computer has been on for about 18 days.
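
As a rough cross-check of those numbers, here is a minimal sketch (not the spreadsheet itself) of the closed-form estimate, assuming BOINC's usual one-week credit half-life and a constant credit rate; the 1080 credits/day and 18 days figures are taken from the post above:

Code:
#include <cmath>
#include <cstdio>

// With a fixed half-life and a constant credits-per-day rate, RAC approaches
// that rate asymptotically; after `days` of uptime it has reached the fraction
// 1 - 2^(-days / half_life_days).  A one-week (7-day) half-life is assumed here.
double estimated_rac(double credits_per_day, double days, double half_life_days = 7.0) {
    return credits_per_day * (1.0 - std::pow(2.0, -days / half_life_days));
}

int main() {
    // T-COMPUSTICK figures from the post above: ~1080 credits/day, on for ~18 days.
    std::printf("18 days @ 1080/day -> ~%.0f RAC\n", estimated_rac(1080.0, 18.0)); // ~898
    // The "4th week" case discussed later: 1000 credits/day for 28 days.
    std::printf("28 days @ 1000/day -> ~%.0f RAC\n", estimated_rac(1000.0, 28.0)); // ~938
    return 0;
}

The first figure lands close to the 899 the spreadsheet reports.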


  • T-Mike
  • Sr. Member

    • 375


    • 2
    • February 06, 2018, 06:12:58 PM
Let's see if my estimate is accurate after 4 weeks:
My computer with an Atom processor gets about half the CPU mark of the CPU in your computer and has the same number of cores. Your Ryzen 1700X estimate comes from comparing it to 2 different CPUs with known credits/day and using the scores from cpubenchmark.net. The predictions are for full CPU usage without any interference.

Zotac: 1,800 RAC max
1700x: 17,305 RAC max

Revised quote:
Zotac: 1,800 RAC max
1700x: 15,464 RAC max

Updated because I was using a wrong RAC base number in my calculation for the 1700x.
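
For anyone reproducing that kind of estimate, here is a minimal sketch of the scaling being described: scale a reference machine's measured credits/day by the ratio of cpubenchmark.net scores, and treat the result as the max RAC. The numbers below are illustrative placeholders, not the actual figures behind the Zotac and 1700X estimates:

Code:
#include <cstdio>

// Benchmark-ratio scaling: estimated credits/day for a target CPU is a
// reference machine's measured credits/day times the ratio of their
// cpubenchmark.net scores.  With a steady workload, max RAC converges to
// credits/day.  All values below are hypothetical placeholders.
int main() {
    double ref_credits_per_day = 1800.0;   // hypothetical reference machine
    double ref_score           = 1100.0;   // hypothetical benchmark score
    double target_score        = 9500.0;   // hypothetical target CPU score

    double est_max_rac = ref_credits_per_day * (target_score / ref_score);
    std::printf("Estimated target max RAC: ~%.0f\n", est_max_rac);
    return 0;
}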


  • Rob Andrews
  • Administrator

    • 4090


    • 97
    • June 05, 2017, 08:09:04 PM
    • Patmos, Island Of
Let's clear up one thing first: when you say 100 RAC input, do you mean you are actually RECEIVING 100 credits per day? Because it doesn't make any sense for the RAC to be 920 if you're only receiving 100 credits per day. Your RAC should be 100.

The RAC on T-COMPUSTICK is actually pretty close to what the Excel sheet is giving me. I receive about 1080 credits/day, and I'm getting a RAC of 873 on the website and 899 in the Excel file. The computer has been on for about 18 days.
Sorry, I meant if you are receiving 1000 per day. Yes, you would have a RAC of 940 or so by the 4th week, close to what the sheet says.

But without compounding and tracking each day's old RAC, your sheet shows that on the 14th day the user would only have accrued 75% of the RAC, when in reality it would be closer to 83% (due to compounding).  That would put the sheet on par with what I would expect to have as a researcher, and on par with what the BOINC formula is doing when you plug each day's old RAC plus each day's new RAC into the function.

And that's why I made a comment a week ago about being able to compare two machines as long as they were both on for 14 days: for all intents and purposes the RAC has stabilized after that point and they can be compared.



  • rastiks
  • Newbie

    • 22


    • 1
    • February 11, 2018, 05:48:57 AM
As far as testing these things:
 -  Allow unbanked to be compensated without PODC Updates
We need to ask Rastiks to not shut down the cell phone, and not add any non-ARM RAC.
Looking in my Sanctuary for Rastiks, I don't see your CPID in the list: fe553a955f0e21d46724858870014cbe.  (I found your CPID by looking at the unbanked report in the pool.)  Anyway, I see Rastiks is on the team, but you're not associated in Biblepay (exec associate).  I see this by typing exec datalist dcc and not finding your CPID.  This would be a nice feature to test, so could Rastiks please associate your CPID, or could someone else start an unbanked CPID as well?

Sorry, I was off for a few days, but I associated my CPID weeks ago:

{
  "Command": "getboincinfo",
  "CPID": "fe553a955f0e21d46724858870014cbe",
  "Address": "yY6uBmPyks4VEBjZ18Kzw1LUNiTewAXSr3",
  "CPIDS": "fe553a955f0e21d46724858870014cbe;",
  "CPID-Age (hours)": 422182,

Will continue running BOINC on phone + tablet.


  • Rob Andrews
  • Administrator

    • 4090


    • 97
    • June 05, 2017, 08:09:04 PM
    • Patmos, Island Of
Sorry, I was off for a few days, but I associated my CPID weeks ago:

{
  "Command": "getboincinfo",
  "CPID": "fe553a955f0e21d46724858870014cbe",
  "Address": "yY6uBmPyks4VEBjZ18Kzw1LUNiTewAXSr3",
  "CPIDS": "fe553a955f0e21d46724858870014cbe;",
  "CPID-Age (hours)": 422182,

Will continue running BOINC on phone + tablet.


I don't see you in the chain; we're up to 1.0.9.5 (had a couple of mandatories that erased the chain).  I'm looking in exec datalist dcc and don't see you.  We're on block 11533; please see if you're synced?
Blockhash for 11500: ccb74250c378c051ff560e40fcfe4bcb194f965b41eb595bb4a7af9c5192dd5e



  • T-Mike
  • Sr. Member

    • 375


    • 2
    • February 06, 2018, 06:12:58 PM
Sorry, I meant if you are receiving 1000 per day. Yes, you would have a RAC of 940 or so by the 4th week, close to what the sheet says.

But without compounding and tracking each day's old RAC, your sheet shows that on the 14th day the user would only have accrued 75% of the RAC, when in reality it would be closer to 83% (due to compounding).  That would put the sheet on par with what I would expect to have as a researcher, and on par with what the BOINC formula is doing when you plug each day's old RAC plus each day's new RAC into the function.

And that's why I made a comment a week ago about being able to compare two machines as long as they were both on for 14 days: for all intents and purposes the RAC has stabilized after that point and they can be compared.

I went through the code shown on the wiki and it's doing exactly what the posted formula does. On my spreadsheet, I did not use the old RAC because it doesn't matter, since the new credit per day is the same every day; it would have mattered if the credits per day were changing. I've updated the spreadsheet but it still shows 75% at 14 days. Are you saying the compounding is not in the formula?

Code:
void update_average(
    double work_start_time,         // when new work was started
                                    // (or zero if no new work)
    double work,                    // amount of new work
    double half_life,
    double& avg,                    // average work per day (in and out)
    double& avg_time                // when average was last computed
) {
    double now = dtime();

    if (avg_time) {
        // If an average R already exists, imagine that the new work was done
        // entirely between avg_time and now.
        // That gives a rate R'.
        // Replace R with a weighted average of R and R',
        // weighted so that we get the right half-life if R' == 0.
        //
        // But this blows up if avg_time == now; you get 0*(1/0)
        // So consider the limit as diff->0,
        // using the first-order Taylor expansion of
        // exp(x)=1+x+O(x^2).
        // So to the lowest order in diff:
        // weight = 1 - diff ln(2) / half_life
        // so one has
        // avg += (1-weight)*(work/diff_days)
        // avg += [diff*ln(2)/half_life] * (work*SECONDS_PER_DAY/diff)
        // notice that diff cancels out, leaving
        // avg += [ln(2)/half_life] * work*SECONDS_PER_DAY

        double diff, diff_days, weight;

        diff = now - avg_time;
        if (diff<0) diff=0;

        diff_days = diff/SECONDS_PER_DAY;
        weight = exp(-diff*M_LN2/half_life);

        avg *= weight;

        if ((1.0-weight) > 1.e-6) {
            avg += (1-weight)*(work/diff_days);
        } else {
            avg += M_LN2*work*SECONDS_PER_DAY/half_life;
        }
    } else if (work) {
        // If first time, average is just work/duration
        //
        double dd = (now - work_start_time)/SECONDS_PER_DAY;
        avg = work/dd;
    }
    avg_time = now;
}
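
To check the compounding question directly, here is a minimal, self-contained sketch that steps the decay branch of the function above once per day. It assumes a 7-day credit half-life and a flat 100 credits granted per day; update_average_step is just a condensed restatement of the decay branch above, not BOINC code:

Code:
#include <cmath>
#include <cstdio>

static const double SECONDS_PER_DAY = 86400.0;
static const double LN2 = 0.693147180559945;

// Condensed restatement of the decay branch of update_average() above:
// decay the old average, then blend in the new work at its observed rate.
void update_average_step(double diff_seconds, double work, double half_life, double& avg) {
    double diff_days = diff_seconds / SECONDS_PER_DAY;
    double weight = std::exp(-diff_seconds * LN2 / half_life);
    avg *= weight;
    if ((1.0 - weight) > 1.e-6) {
        avg += (1.0 - weight) * (work / diff_days);
    } else {
        avg += LN2 * work * SECONDS_PER_DAY / half_life;
    }
}

int main() {
    const double half_life = 7.0 * SECONDS_PER_DAY; // assumed one-week credit half-life
    double avg = 0.0;                               // new host: RAC starts at zero

    // Grant 100 credits once per day for 14 days, compounding daily.
    for (int day = 1; day <= 14; ++day) {
        update_average_step(SECONDS_PER_DAY, 100.0, half_life, avg);
    }
    std::printf("RAC after 14 daily grants of 100: %.1f\n", avg); // prints ~75.0
    return 0;
}

Even with the daily compounding included, the iteration lands on about 75% of the 100/day steady state at day 14, the same as the closed form 1 - 2^(-14/7), so the spreadsheet's 75% figure is consistent with this code.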


  • Rob Andrews
  • Administrator

    • 4090


    • 97
    • June 05, 2017, 08:09:04 PM
    • Patmos, Island Of
I went through the code shown on the wiki and it's doing exactly what the posted formula does. On my spreadsheet, I did not use the old RAC because it doesn't matter, since the new credit per day is the same every day; it would have mattered if the credits per day were changing. I've updated the spreadsheet but it still shows 75% at 14 days. Are you saying the compounding is not in the formula?


It looks like this sheet is exactly accurate.

I believe I need to update my prior RAC assessment to say "the lion's share of RAC is realized by day 20" (with 86%), and very little (about 13%) happens between day 21 and 40.

I just added the RAC decay function, with in and out, to the next version of the client, and it agrees with your latest spreadsheet.
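
For reference, the closed-form fractions behind those day-20 and day-40 figures, assuming a 7-day half-life and a constant credit rate (a quick sketch, not taken from the sheet):

Code:
#include <cmath>
#include <cstdio>

// Fraction of steady-state RAC reached after `d` days of constant crunching,
// assuming a one-week (7-day) credit half-life: 1 - 2^(-d/7).
int main() {
    const int days[] = {7, 14, 20, 28, 40};
    for (int d : days) {
        double frac = 1.0 - std::pow(2.0, -d / 7.0);
        std::printf("day %2d: %4.1f%%\n", d, 100.0 * frac);
    }
    return 0;
}

This prints roughly 50%, 75%, 86%, 94%, and 98%, which lines up with the "lion's share by day 20" reading above.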





  • T-Mike
  • Sr. Member

    • 375


    • 2
    • February 06, 2018, 06:12:58 PM
It looks like this sheet is exactly accurate.

I believe I need to update my prior RAC assessment to say "the lion's share of RAC is realized by day 20" (with 86%), and very little (about 13%) happens between day 21 and 40.

I just added the RAC decay function, with in and out, to the next version of the client, and it agrees with your latest spreadsheet.

I'm glad we finally got it sorted out!


  • Rob Andrews
  • Administrator

    • 4090


    • 97
    • June 05, 2017, 08:09:04 PM
    • Patmos, Island Of
I'm glad we finally got it sorted out!
Me too.

And that means what 616West originally said was correct on his part; sorry about that, 616.


  • jaapgvk
  • Hero Member

    • 558


    • 31
    • September 01, 2017, 08:02:57 PM
    • Netherlands
Me too.

And that means what 616West originally said was correct on his part; sorry about that, 616.

For the past few days I was running my Android phone on a separate CPID to test the 'unbanked' function, but forgot to join the Biblepay team. Oops  :o

I've joined now, so we'll see how it goes.

About the implementation in March, Rob: I'm all for intensive testing to make sure that no bugs escape us. Do you think it would be beneficial to get more people testing?


  • Rob Andrews
  • Administrator

    • 4090


    • 97
    • June 05, 2017, 08:09:04 PM
    • Patmos, Island Of
For the past few days I was running my Android phone on a separate CPID to test the 'unbanked' function, but forgot to join the Biblepay team. Oops  :o

I've joined now, so we'll see how it goes.

About the implementation in March, Rob: I'm all for intensive testing to make sure that no bugs escape us. Do you think it would be beneficial to get more people testing?

Yes, the more the merrier.  I'll post something in the other forum now.



  • Rob Andrews
  • Administrator

    • 4090


    • 97
    • June 05, 2017, 08:09:04 PM
    • Patmos, Island Of
I took a look at the logs on my two testnet nodes and don't see anything bad, other than a lot of unnecessary logging, which I added switches for and removed (debugpodc=true).  Some Sanctuary mnpayments logging was also removed.

Can't find any other bugs...  Let's hope we have some more volunteers helping us ASAP.

Yes, Jaap, let's also not forget to verify your unbanked flag on my sanc later; let me know after you receive an unbanked payment.