Problem D. equation

Time limit: 1000 ms   Memory limit: 256 MB

You are given two integers $N, C$ and two integer sequences $a$ and $b$ of length $N$. The sequences are indexed from $1$ to $N$.

Please solve the following equation for $x$:

$\sum\limits_{i=1}^{N}|a_i \cdot x + b_i| = C$, where $|v|$ denotes the absolute value of $v$.
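Given the constraints below ($a_i \ge 1$), each term $|a_i \cdot x + b_i|$ is a convex piecewise-linear function of $x$, and so is their sum. As a minimal C++ sketch (not part of the statement; the helper name `satisfies` is hypothetical), a candidate rational $x = p/q$ with $q > 0$ can be checked in exact integer arithmetic by multiplying both sides by $q$:

```cpp
#include <cstddef>
#include <cstdint>
#include <cstdlib>
#include <vector>

// Check whether x = p/q (q > 0) solves sum_i |a_i * x + b_i| = C.
// Multiplying both sides by q turns the equation into
// sum_i |a_i * p + b_i * q| = C * q, so no floating point is needed.
bool satisfies(const std::vector<int64_t>& a, const std::vector<int64_t>& b,
               int64_t C, int64_t p, int64_t q) {
    int64_t lhs = 0;
    for (std::size_t i = 0; i < a.size(); ++i)
        lhs += std::llabs(a[i] * p + b[i] * q);
    return lhs == C * q;
}
```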
 

Input

The first line contains an integer $T$, the number of tests. Each test consists of $N+1$ lines: the first line contains two integers $N$ and $C$, and the $i$-th of the following $N$ lines contains two integers $a_i$ and $b_i$.

* $1 \le T \le 50$

* $1 \le N \le 10^5$

* $1 \le a_i \le 1000$

* $-1000 \le b_i \le 1000$

* $1 \le C \le 10^9$

* only $5$ tests have $N$ larger than $1000$
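A minimal sketch of reading this format (the variable names are mine; `long long` is an assumption chosen to be comfortably wide for the bounds above):

```cpp
#include <cstdio>
#include <vector>

int main() {
    int T;
    std::scanf("%d", &T);
    while (T--) {
        int N;
        long long C;
        std::scanf("%d %lld", &N, &C);
        std::vector<long long> a(N), b(N);
        for (int i = 0; i < N; ++i)
            std::scanf("%lld %lld", &a[i], &b[i]);
        // ... solve this test and print its answer line here ...
    }
    return 0;
}
```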
 

Output

For each test, output one line.
If there are infinitely many solutions, this line consists of the single integer $-1$.
Otherwise, the line starts with an integer $m$, the number of solutions, followed by $m$ fractions listing all solutions from smallest to largest. (It can be proved that every solution can be written as a fraction.) Each fraction must have the form "a/b", where a is an integer, b is a positive integer, and $\gcd(|a|, b) = 1$. If a solution is $0$, output it as "0/1".
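A minimal sketch of this normalization (the helper name `print_fraction` is hypothetical; it assumes C++17 for `std::gcd` and that numerator and denominator fit in 64 bits):

```cpp
#include <cstdint>
#include <cstdio>
#include <numeric>

// Print the rational p/q in the mandated form "a/b":
// b must be positive and gcd(|a|, b) = 1. A zero value comes out
// as "0/1" automatically, because std::gcd(0, q) == q.
void print_fraction(int64_t p, int64_t q) {
    if (q < 0) { p = -p; q = -q; }            // force a positive denominator
    int64_t g = std::gcd(p < 0 ? -p : p, q);  // g >= 1 because q > 0
    std::printf("%lld/%lld", static_cast<long long>(p / g),
                static_cast<long long>(q / g));
}
```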
 

Sample Input

```
4
2 3
1 2
1 -1
3 3
2 1
2 2
2 3
2 1
3 5
4 -1
3 2
1 -1
1 -2
1 -3
```

Sample Output

```
-1
2 -3/2 -1/2
0
1 2/1
```
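As a quick check of the samples: in the first test $|x + 2| + |x - 1| = 3$ holds for every $x \in [-2, 1]$, so there are infinitely many solutions and the answer is $-1$; in the last test $|x - 1| + |x - 2| + |x - 3| = 2$ only at $x = 2$, which is printed as 2/1.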
