BRDF and spherical coordinates in ray tracing

Date: 2015-11-29 21:52:51

Tags: c++ math raytracing coordinate-systems

I developed a ray tracer that uses the standard Phong / Blinn-Phong lighting model. I am now modifying it to support physically based rendering, so I am implementing various BRDF models. At the moment I am focusing on the Oren-Nayar and Torrance-Sparrow models. Each of these relies on spherical coordinates to represent the incident (wi) and outgoing (wo) light directions.

My question is: what is the right way to convert wi and wo from Cartesian coordinates to spherical coordinates?

I am applying the standard formulas given here, https://en.wikipedia.org/wiki/Spherical_coordinate_system#Coordinate_system_conversions, but I am not sure I am doing this correctly, because my vectors are not anchored at the origin of the world Cartesian coordinate system; they are centered at the point where the ray intersects the object.
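One approach that seems standard is to stop reading theta and phi off world-space vectors entirely: build a local orthonormal frame at the intersection point with the shading normal as the zenith axis, transform wi and wo into that frame, and only then apply the spherical-coordinate formulas. A minimal sketch of that idea follows; `Vec3`, `worldToLocal`, and the helpers are hypothetical stand-ins (not my `Vector3D` class), and this sketch uses z as the zenith axis rather than y:

```cpp
#include <cmath>

// Hypothetical minimal vector type for illustration.
struct Vec3 {
    float x, y, z;
};

static float dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

static Vec3 cross(const Vec3& a, const Vec3& b) {
    return { a.y * b.z - a.z * b.y,
             a.z * b.x - a.x * b.z,
             a.x * b.y - a.y * b.x };
}

static Vec3 normalize(const Vec3& v) {
    float len = std::sqrt(dot(v, v));
    return { v.x / len, v.y / len, v.z / len };
}

// Express a world-space direction v in a local frame whose z axis is the
// shading normal n. After this, theta = acosf(local.z) and
// phi = atan2f(local.y, local.x) are well defined no matter where the
// intersection point sits in world space.
Vec3 worldToLocal(const Vec3& n, const Vec3& v) {
    // Seed the tangent with any axis not parallel to n.
    Vec3 a = (std::fabs(n.x) > 0.9f) ? Vec3{0.f, 1.f, 0.f} : Vec3{1.f, 0.f, 0.f};
    Vec3 t = normalize(cross(a, n));   // tangent
    Vec3 b = cross(n, t);              // bitangent (already unit length)
    return { dot(v, t), dot(v, b), dot(v, n) };
}
```

With this in place, the spherical-coordinate formulas operate on directions expressed relative to the surface, which is what the BRDF formulas expect.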

Here you can find my current implementation:

Can anyone explain to me the correct way to convert the wi and wo vectors from Cartesian to spherical coordinates?

Update

I am copying the relevant parts of the code here:

Spherical coordinate computation

float Vector3D::sphericalTheta() const {

    // y is treated as the zenith axis here (phi below is measured in the
    // x-z plane); the clamp guards acosf against values just outside [-1, 1].
    float sphericalTheta = acosf(Utils::clamp(y, -1.f, 1.f));

    return sphericalTheta;
}

float Vector3D::sphericalPhi() const {

    // Azimuth in the x-z plane, remapped from [-pi, pi] to [0, 2*pi).
    float phi = atan2f(z, x);

    return (phi < 0.f) ? phi + 2.f * M_PI : phi;
}

Oren-Nayar

OrenNayar::OrenNayar(Spectrum<constant::spectrumSamples> reflectanceSpectrum, float degree) : reflectanceSpectrum{reflectanceSpectrum} {

    float sigma = Utils::degreeToRadian(degree);
    float sigmaPowerTwo = sigma * sigma;

    // A = 1 - sigma^2 / (2 * (sigma^2 + 0.33)): the denominator needs its own
    // parentheses, otherwise "/ 2.0f * (...)" multiplies instead of dividing.
    A = 1.0f - (sigmaPowerTwo / (2.0f * (sigmaPowerTwo + 0.33f)));
    B = 0.45f * sigmaPowerTwo / (sigmaPowerTwo + 0.09f);
}

Spectrum<constant::spectrumSamples> OrenNayar::f(const Vector3D& wi, const Vector3D& wo, const Intersection* intersection) const {

    float thetaI = wi.sphericalTheta();
    float phiI = wi.sphericalPhi();

    float thetaO = wo.sphericalTheta();
    float phiO = wo.sphericalPhi();

    float alpha = std::fmaxf(thetaI, thetaO);
    float beta = std::fminf(thetaI, thetaO);

    Spectrum<constant::spectrumSamples> orenNayar = reflectanceSpectrum * constant::inversePi * (A + B * std::fmaxf(0, cosf(phiI - phiO) * sinf(alpha) * tanf(beta)));

    return orenNayar;
}
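For comparison, PBRT-style implementations avoid extracting theta and phi altogether: once wi and wo are expressed in the local shading frame (normal as the z axis), every trigonometric term in the Oren-Nayar formula falls out of the Cartesian components directly. A sketch of that idea, where `LocalVec` is a hypothetical minimal struct (not my `Vector3D`) holding local-frame coordinates:

```cpp
#include <cmath>
#include <algorithm>

// Direction already expressed in the local shading frame (z = normal).
struct LocalVec { float x, y, z; };

// Computes max(0, cos(phiI - phiO)) * sin(alpha) * tan(beta) without any
// inverse trigonometric functions.
float orenNayarAngleTerm(const LocalVec& wi, const LocalVec& wo) {
    float cosThetaI = wi.z, cosThetaO = wo.z;
    float sinThetaI = std::sqrt(std::max(0.f, 1.f - cosThetaI * cosThetaI));
    float sinThetaO = std::sqrt(std::max(0.f, 1.f - cosThetaO * cosThetaO));

    // cos(phiI - phiO) = cosPhiI*cosPhiO + sinPhiI*sinPhiO, with
    // cosPhi = x / sinTheta and sinPhi = y / sinTheta.
    float maxCos = 0.f;
    if (sinThetaI > 1e-4f && sinThetaO > 1e-4f) {
        float dCos = (wi.x * wo.x + wi.y * wo.y) / (sinThetaI * sinThetaO);
        maxCos = std::max(0.f, dCos);
    }

    // alpha = max(thetaI, thetaO) is the direction with the SMALLER cosine;
    // beta = min(thetaI, thetaO) is the other one.
    float sinAlpha, tanBeta;
    if (std::fabs(cosThetaI) > std::fabs(cosThetaO)) {
        sinAlpha = sinThetaO;
        tanBeta = sinThetaI / std::fabs(cosThetaI);
    } else {
        sinAlpha = sinThetaI;
        tanBeta = sinThetaO / std::fabs(cosThetaO);
    }
    return maxCos * sinAlpha * tanBeta;
}
```

The final BRDF value would then be reflectance * (1/pi) * (A + B * orenNayarAngleTerm(wi, wo)), matching the formula used in f above.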

Torrance-Sparrow

float TorranceSparrow::G(const Vector3D& wi, const Vector3D& wo, const Vector3D& wh, const Intersection* intersection) const {

    Vector3D normal = intersection->normal;
    normal.normalize();

    float normalDotWh = fabsf(normal.dot(wh));
    float normalDotWo = fabsf(normal.dot(wo));
    float normalDotWi = fabsf(normal.dot(wi));
    float woDotWh = fabsf(wo.dot(wh));

    float G = fminf(1.0f, std::fminf((2.0f * normalDotWh * normalDotWo)/woDotWh, (2.0f * normalDotWh * normalDotWi)/woDotWh));

    return G;
}

float TorranceSparrow::D(const Vector3D& wh, const Intersection* intersection) const {

    Vector3D normal = intersection->normal;
    normal.normalize();

    float cosThetaH = fabsf(wh.dot(normal));

    float Dd = (exponent + 2) * constant::inverseTwoPi * powf(cosThetaH, exponent);

    return Dd;
}
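As a sanity check on the distribution term, the Blinn D above is normalized so that the hemisphere integral of D(wh) * cos(thetaH) equals 1. A small standalone numerical check of that property (with the (exponent + 2) / (2 * pi) constant written out):

```cpp
#include <cmath>

// Blinn microfacet distribution with the (exponent + 2) / (2 * pi)
// normalization, written as a free function for testing.
float blinnD(float cosThetaH, float exponent) {
    const float kPi = 3.14159265358979f;
    return (exponent + 2.f) / (2.f * kPi) * std::pow(cosThetaH, exponent);
}

// Midpoint-rule estimate of the hemisphere integral of D(wh) * cos(thetaH).
// For a properly normalized distribution this should come out close to 1.
float hemisphereIntegral(float exponent, int steps = 2048) {
    const float kPi = 3.14159265358979f;
    const float dTheta = (kPi / 2.f) / steps;
    float sum = 0.f;
    for (int i = 0; i < steps; ++i) {
        float theta = (i + 0.5f) * dTheta;
        // dwh = sin(theta) dtheta dphi; the phi integral contributes 2 * pi.
        sum += blinnD(std::cos(theta), exponent) * std::cos(theta) * std::sin(theta);
    }
    return sum * dTheta * 2.f * kPi;
}
```

Analytically, (e + 2) * integral of cos^(e+1) * sin over [0, pi/2] is exactly 1 for any exponent e, so the numerical estimate should sit very close to 1 for any reasonable exponent.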

Spectrum<constant::spectrumSamples> TorranceSparrow::f(const Vector3D& wi, const Vector3D& wo, const Intersection* intersection) const {

    Vector3D normal = intersection->normal;
    normal.normalize();

    float thetaI = wi.sphericalTheta();
    float thetaO = wo.sphericalTheta();

    float cosThetaO = fabsf(cosf(thetaO));
    float cosThetaI = fabsf(cosf(thetaI));

    if(cosThetaI == 0 || cosThetaO == 0) {

        return reflectanceSpectrum * 0.0f;
    }

    Vector3D wh = (wi + wo);
    wh.normalize();

    float cosThetaH = wi.dot(wh);

    float F = Fresnel::dieletricFresnel(cosThetaH, refractiveIndex);
    float g = G(wi, wo, wh, intersection);
    float d = D(wh, intersection);

    printf("f %f g %f d %f \n", F, g, d);
    printf("result %f \n", ((d * g * F) / (4.0f * cosThetaI * cosThetaO)));

    Spectrum<constant::spectrumSamples> torranceSparrow = reflectanceSpectrum * ((d * g * F) / (4.0f * cosThetaI * cosThetaO));

    return torranceSparrow;
}
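To check whether the Fresnel term is behaving plausibly, a Schlick approximation makes a handy reference, since it should roughly track a full dielectric Fresnel evaluation. This is an illustrative sketch for cross-checking, not the `Fresnel::dieletricFresnel` implementation used above:

```cpp
#include <cmath>

// Schlick's approximation to the dielectric Fresnel reflectance for an
// air-dielectric interface. cosTheta is the cosine of the angle between the
// incident direction and the half vector (or normal).
float fresnelSchlick(float cosTheta, float refractiveIndex) {
    // Reflectance at normal incidence.
    float r0 = (refractiveIndex - 1.f) / (refractiveIndex + 1.f);
    r0 *= r0;
    float m = 1.f - cosTheta;
    return r0 + (1.f - r0) * m * m * m * m * m;
}
```

For glass-like indices (around 1.5) this gives roughly 0.04 at normal incidence and climbs to 1 at grazing angles, which is the shape the F values in the printf output should follow.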

Update 2

After some searching, I found this implementation of the Oren-Nayar BRDF:

http://content.gpwiki.org/index.php/D3DBook:(Lighting)_Oren-Nayar

In the implementation above, theta for wi and wo is obtained simply as arccos(wo.dotProduct(Normal)) and arccos(wi.dotProduct(Normal)). This seems reasonable to me, because we can use the normal at the intersection point as the zenith direction of the spherical coordinate system and do the calculation from there. The computation of gamma = cos(phi_wi - phi_wo) applies some kind of projection of wi and wo onto what it calls "tangent space". Assuming everything in this implementation is correct, can I just use the formulas |View - Normal x (View.dotProduct(Normal))| and |Light - Normal x (Light.dotProduct(Normal))| to obtain the phi coordinates (instead of using arctan("something"))?
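To make that concrete, the projection those formulas describe would look roughly like this, with one caveat: the projected vectors have to be normalized before taking the dot product, otherwise the result is cos(phi_wi - phi_wo) scaled by the two sine factors. `Vec3` and the helpers below are hypothetical stand-ins for my vector class:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

static float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3 sub(const Vec3& a, const Vec3& b) { return { a.x-b.x, a.y-b.y, a.z-b.z }; }
static Vec3 scale(const Vec3& v, float s) { return { v.x*s, v.y*s, v.z*s }; }
static Vec3 normalize(const Vec3& v) {
    float len = std::sqrt(dot(v, v));
    return { v.x/len, v.y/len, v.z/len };
}

// cos(phi_wi - phi_wo) via projection onto the plane perpendicular to n.
float cosDeltaPhi(const Vec3& n, const Vec3& wi, const Vec3& wo) {
    Vec3 wiProj = sub(wi, scale(n, dot(wi, n)));  // wi - n * (wi . n)
    Vec3 woProj = sub(wo, scale(n, dot(wo, n)));  // wo - n * (wo . n)
    return dot(normalize(wiProj), normalize(woProj));
}
```

This yields the gamma term directly, without ever computing an explicit phi angle via arctan, which appears to be exactly what the gpwiki implementation is doing.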

0 Answers