Title Link: https://www.acwing.com/problem/content/description/302/

Given n tasks that must be processed in the given order, task i takes T[i] time and has a cost coefficient C[i]: its cost is its completion time multiplied by C[i]. The tasks are completed in consecutive batches, and every task in a batch shares the batch's completion time. Starting each batch takes an extra S time to start the machine.

Find the minimum total cost of completing all the tasks.

Optimizing the naive DP yields an O(N^2) algorithm. The trick is to charge the future impact of each batch's startup time up front: the startup time S of a new batch delays every remaining task, so its cost S * (sumC[n] - sumC[j]) is added immediately, which removes the "batch" dimension from the state.
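Written out, with sumT and sumC the prefix sums of T and C, the recurrence that the code below implements is:

```latex
% f_i: minimum cost when the first i tasks are done and tasks j+1..i form the last batch
f_i = \min_{0 \le j < i}\left\{ f_j + \mathrm{sumT}_i \,(\mathrm{sumC}_i - \mathrm{sumC}_j) + S \,(\mathrm{sumC}_n - \mathrm{sumC}_j) \right\}
```

The batch j+1..i finishes at time sumT[i] (plus startups already charged), contributing sumT[i] * (sumC[i] - sumC[j]); its own startup S delays every task after j, contributing S * (sumC[n] - sumC[j]).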

code:

#include <iostream>
#include <cstdio>
#include <cstring>
using namespace std;
typedef long long ll;

const int N = 5010;
ll f[N];
ll sumT[N], sumC[N];

int main() {
    int t, c;
    int n, S;
    cin >> n >> S;
    for (int i = 1; i <= n; i++) {
        cin >> t >> c;
        sumT[i] = sumT[i - 1] + t;
        sumC[i] = sumC[i - 1] + c;
    }
    memset(f, 0x3f, sizeof(f));
    f[0] = 0;
    for (int i = 1; i <= n; i++)
        for (int j = 0; j < i; j++)
            // last batch is j+1..i; charge its startup time S to all later tasks up front
            f[i] = min(f[i], f[j] + sumT[i] * (sumC[i] - sumC[j]) + S * (sumC[n] - sumC[j]));
    cout << f[n] << endl;
    return 0;
}

Optimization 1: view each decision j as the point (sumC[j], f[j]). For a fixed i, the recurrence asks for the point that minimizes the intercept of a line with slope k = S + sumT[i] passing through it. Because sumC[i] and sumT[i] are both monotonically increasing, only points on the lower convex hull, where the slope between adjacent points is increasing, can ever be the optimal decision. It therefore suffices to maintain that hull with a monotonic queue: pop the head while the slope of its first edge is at most k, and pop the tail to keep the slopes increasing when point i is inserted.
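The line form behind this can be made explicit by fixing i and rearranging the recurrence (a restatement of the argument above):

```latex
f_j = (S + \mathrm{sumT}_i)\,\mathrm{sumC}_j + \left(f_i - \mathrm{sumT}_i\,\mathrm{sumC}_i - S\,\mathrm{sumC}_n\right)
```

Each decision j is a point (sumC[j], f[j]); minimizing f[i] means sweeping a line of slope k = S + sumT[i] upward until it first touches one of these points, and the touched point is the optimal decision.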

Note that the decision point 0 must be placed in the queue initially. Each point is pushed and popped at most once, so the total time complexity is O(N).

code:

#include <iostream>
#include <cstdio>
using namespace std;
typedef long long ll;

const int N = 300010;
ll f[N];
ll sumt[N], sumc[N];
int n, s;
int q[N];

int main() {
    cin >> n >> s;
    for (int i = 1; i <= n; i++) {
        int t, c;
        scanf("%d%d", &t, &c);
        sumt[i] = sumt[i - 1] + t;
        sumc[i] = sumc[i - 1] + c;
    }
    f[0] = 0;
    int l = 1, r = 1;
    q[1] = 0; // add the point (0, 0) as the initial decision
    for (int i = 1; i <= n; i++) {
        // pop decisions from the front that can never be optimal again:
        // the slope of the first edge is at most k = s + sumt[i]
        while (l < r && f[q[l + 1]] - f[q[l]] <= (s + sumt[i]) * (sumc[q[l + 1]] - sumc[q[l]]))
            l++;
        f[i] = f[q[l]] - (s + sumt[i]) * sumc[q[l]] + sumt[i] * sumc[i] + s * sumc[n];
        // pop points from the back that would break the convexity of the hull
        while (l < r && (f[q[r]] - f[q[r - 1]]) * (sumc[i] - sumc[q[r]]) >=
                        (f[i] - f[q[r]]) * (sumc[q[r]] - sumc[q[r - 1]]))
            r--;
        q[++r] = i;
    }
    cout << f[n] << endl;
    return 0;
}

Optimization 2: when t[i] can be negative, the query slope k = s + sumT[i] is no longer monotonic in i. If we kept deleting the small-slope edges at the head of the queue as before, a later query with a smaller slope would no longer be able to find its matching point. The small-slope part must therefore be retained: maintain the complete lower convex hull, and binary search for the point such that the slopes of the edges to its left are at most k and the slope of the edge to its right is greater than k.

The comparisons multiply two long long values, which can overflow 64 bits, so the products need to be computed in a wider type (such as __int128) or as double before comparing.

code:

#include <iostream>
#include <cstdio>
using namespace std;
typedef long long ll;

const int N = 300010;
ll f[N];
int n, s;
ll sumc[N], sumt[N];
int q[N];
int l, r;

// find the first point on the hull whose outgoing slope is greater than k
int binary_search(ll k) {
    if (l == r) return q[l];
    int L = l, R = r;
    while (L < R) {
        int mid = (L + R) >> 1;
        if (f[q[mid + 1]] - f[q[mid]] > (double)k * (sumc[q[mid + 1]] - sumc[q[mid]]))
            R = mid;
        else
            L = mid + 1;
    }
    return q[L];
}

int main() {
    cin >> n >> s;
    for (int i = 1; i <= n; i++) {
        int t, c;
        scanf("%d%d", &t, &c);
        sumt[i] = sumt[i - 1] + t;
        sumc[i] = sumc[i - 1] + c;
    }
    f[0] = 0;
    l = 1; r = 1;
    q[l] = 0;
    for (int i = 1; i <= n; i++) {
        int p = binary_search(sumt[i] + s); // best decision index for slope s + sumt[i]
        f[i] = f[p] - (s + sumt[i]) * sumc[p] + sumt[i] * sumc[i] + s * sumc[n];
        // keep the complete hull; compare cross products as double to avoid overflow
        while (l < r && (double)(f[q[r]] - f[q[r - 1]]) * (sumc[i] - sumc[q[r]]) >=
                        (double)(f[i] - f[q[r]]) * (sumc[q[r]] - sumc[q[r - 1]]))
            r--;
        q[++r] = i;
    }
    printf("%lld\n", f[n]);
    return 0;
}