Problem D. equation
Time limit: 1000 ms
Memory limit: 256 MB
You are given two integers $N, C$ and two integer sequences $a$ and $b$ of length $N$. The sequences are indexed from $1$ to $N$.
Please solve the following equation for $x$:
$\sum\limits_{i=1}^{N} |a_i \cdot x + b_i| = C$, where $|v|$ denotes the absolute value of $v$.
Input
The first line contains a single integer $T$, the number of test cases. Each test case starts with a line containing two integers $N$ and $C$, followed by $N$ lines, the $i$-th of which contains the two integers $a_i$ and $b_i$.
Output
For each test case, output one line.
If there are infinitely many solutions, this line consists of the single integer $-1$.
Otherwise, first print an integer $m$, the number of solutions, followed by $m$ fractions giving all solutions in increasing order. (It can be proved that every solution can be written as a fraction.) Each fraction must be in the form "a/b", where $a$ is an integer, $b$ is a positive integer, and $\gcd(|a|, b) = 1$. If a solution is $0$, output it as "0/1".
Sample input
4
2 3
1 2
1 -1
3 3
2 1
2 2
2 3
2 1
3 5
4 -1
3 2
1 -1
1 -2
1 -3
Sample output
-1
2 -3/2 -1/2
0
1 2/1
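One way to attack the equation: $f(x) = \sum_{i=1}^{N} |a_i x + b_i|$ is a convex piecewise-linear function, so it can equal $C$ only at its breakpoints, on the two unbounded rays, at one crossing per interior segment, or on an entire flat segment at height $C$ (the infinite case). A minimal Python sketch along these lines, using exact rational arithmetic; the helper names `solve` and `fmt` are illustrative, not part of any reference solution:

```python
from fractions import Fraction

def solve(N, C, a, b):
    """Return None for infinitely many solutions, else a sorted list of Fractions."""
    # Terms with a_i == 0 contribute a constant; the rest define breakpoints.
    const = sum(abs(bi) for ai, bi in zip(a, b) if ai == 0)
    terms = [(ai, bi) for ai, bi in zip(a, b) if ai != 0]
    if not terms:
        # f(x) is constant: every x solves the equation, or none does.
        return None if const == C else []
    f = lambda x: const + sum(abs(ai * x + bi) for ai, bi in terms)
    bps = sorted({Fraction(-bi, ai) for ai, bi in terms})
    S = sum(abs(ai) for ai, _ in terms)  # |slope| on the two unbounded rays
    sols = set()
    for p in bps:                        # solutions sitting exactly at breakpoints
        if f(p) == C:
            sols.add(p)
    lo, hi = bps[0], bps[-1]
    if f(lo) < C:                        # left ray, slope -S
        sols.add(lo - (C - f(lo)) / S)
    if f(hi) < C:                        # right ray, slope +S
        sols.add(hi + (C - f(hi)) / S)
    for L, R in zip(bps, bps[1:]):       # f is linear on each segment [L, R]
        fL, fR = f(L), f(R)
        if fL == C == fR:
            return None                  # flat segment at height C: infinitely many
        if min(fL, fR) < C < max(fL, fR):
            slope = (fR - fL) / (R - L)
            sols.add(L + (C - fL) / slope)
    return sorted(sols)

def fmt(x):
    # A Fraction is always stored in lowest terms with a positive denominator,
    # so this directly satisfies the required "a/b" output format.
    return f"{x.numerator}/{x.denominator}"
```

On the samples, `solve(2, 3, [1, 1], [2, -1])` returns `None` (the flat segment between $-2$ and $1$ sits at height $3$), and `solve(3, 2, [1, 1, 1], [-1, -2, -3])` returns the single solution $2$, printed by `fmt` as `2/1`.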